Hangout on Air in Spanish for Webmasters


Hello everyone, good afternoon. You're with Felipe and me from Google's webmaster support team, and we hope this will be the first of many Hangouts on Air in Spanish for everyone who runs content sites aimed at the Spanish-speaking market. We received your requests, we heard you loud and clear, and we're here to answer your questions. You sent us many questions in advance through Google Forms and we'll get to those, but if at any moment you want to ask us things live we're happy to take them, and you can also share questions in the Hangout chat or on Google+; we'll be paying attention to make sure we don't miss any. We already have a good set of substantial questions contributed in the forum, so let's start.

Pedro Luis asks: on a site where the AMP pages live under a separate path rather than being the canonical version, should internal linking point to the AMP URL of each page or to the canonical URL? The answer is that either works, in terms of what Google needs for indexing and for presenting pages to users, because if the canonicalization and the connection between the AMP version and the traditional web version are done correctly, Google will identify the appropriate one to present to each user depending on what they're using, whether a plain web browser or an experience that supports AMP. It may matter for the experience you want to give: if a user is viewing an AMP page and you link to another AMP page, the experience stays consistent and fast. But at the level of indexing and Google's understanding of the pages, it has no importance.

One thing some sites do is make the AMP version the main version for everything, which eliminates the canonical/alternate duality and also removes the doubt about which pages to link to: there is simply one link, to the AMP version, which is the established version that appears everywhere and loads quickly in any browser. If that works for what you're trying to do, it's not bad practice at all. Felipe, any questions on what I've said so far? You can also write to us in the chat; we'll keep watching it.

On to the next one. María tells us: we've run into the problem that when we update the content on our AMP pages, the Google results don't refresh until several hours later, when the information is no longer valid. What actions can we take so that the Google AMP cache refreshes immediately? For that there is a particular tool, the update-cache request: you send a signed request to a specific URL to invalidate the cached content for a particular page on your domain and refresh it if there is a new version. There are several steps to follow: the request has to be signed with a digital key associated with the main domain, and it has to be sent with certain parameters to prevent abuse. There are more details in the AMP developer documentation; it doesn't exist in Spanish yet, but we'll put the link to that documentation in the video comments and also share it afterwards, for example on Twitter.

Perfect. Ana Fernández asks: if I share images from my own site with providers, can that action involve some type of duplicate-content penalty? Is there any possibility of adding a canonical to an image?
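Going back to María's AMP cache question for a moment: the signed update-cache request described above might be sketched like this. The path layout and parameter names here are assumptions drawn from our reading of the AMP Cache documentation, and the signing step is stubbed out, so treat this as an illustration and verify against the current docs before using it.

```python
import time
from urllib.parse import quote

AMP_CACHE = "https://cdn.ampproject.org"

def build_update_cache_url(origin_host: str, path: str, sign) -> str:
    """Build an AMP cache update-cache request for an HTTPS origin.

    `sign` must RSA-SHA256-sign the request path with the private key whose
    public half is published on the origin (the AMP docs describe the exact
    location), returning a base64url string.
    """
    # /update-cache/c/s/<host><path>?amp_action=flush&amp_ts=<unix-time>
    request_path = (
        f"/update-cache/c/s/{origin_host}{path}"
        f"?amp_action=flush&amp_ts={int(time.time())}"
    )
    signature = sign(request_path)
    return f"{AMP_CACHE}{request_path}&amp_url_signature={quote(signature)}"

# Illustration only: a real `sign` would use your RSA private key.
fake_sign = lambda payload: "SIGNATURE"
url = build_update_cache_url("example.com", "/article.amp.html", fake_sign)
```

The key point from the answer above is that the request must be signed with a key tied to the origin domain; an unsigned flush request would let anyone purge your cache entries.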
Let's answer that in two parts. First, yes, it is technically possible to set a rel=canonical in the HTTP header of an image at the moment it is served, but Google doesn't use that information for images; we simply ignore it. In the end what matters is not the image itself but the context in which it is presented. If an image that comes from syndicated content, from a shared source, is appearing on a page that is basically identical to other pages, then those are equivalent pages. But if the image is being presented in another context, in a photo gallery, in a different experience for users, then there is no need to worry about duplicate-content situations at all.

In general, many people are very afraid of the idea of duplicate content, but duplicate content as a manipulative technique has to be intentional. It's not about some part of a site being somewhat similar to another; it's about creating a page that is essentially identical to another, at scale, to try to manipulate Google's algorithms. If you're not making that kind of effort, the worst that can happen is that you dilute the user experience a little and the search results aren't optimal; it doesn't mean Google will take some type of negative action against you. And indeed this is a question we get asked a lot.

Perfect, another question, from Damián: what will happen to sites that still use the hashbang fragment, the #! after the hash? He says: you always tell us to focus on users and not on search engines, but my site is perfect for the user, who can see the content that loads via AJAX so the whole page doesn't refresh; yet for the search engine that content doesn't exist.
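As a footnote to the image question: a canonical cannot be embedded in an image file itself, so it would be expressed as an HTTP Link header. A minimal sketch of what such a response would carry follows, shown only for completeness, since, as the answer says, Google ignores this signal on images; the URLs are placeholders.

```python
def image_response_headers(canonical_url: str) -> dict:
    """Headers for serving an image with an HTTP-level canonical hint.

    The hint rides in a Link header because images have no <head> to put a
    <link rel="canonical"> in. As discussed above, Google ignores this
    signal for images, so this is informational only.
    """
    return {
        "Content-Type": "image/jpeg",
        "Link": f'<{canonical_url}>; rel="canonical"',
    }

headers = image_response_headers("https://example.com/photos/original")
```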
Let's focus on Google Search in particular. The recommendation in question dates from several years ago, when sites began to be developed with AJAX, the technology that allows page content, or portions of it, to be refreshed without reloading the page completely; it's how Google Maps works and how many modern applications work today. Back then we had to develop a scheme to be able to see that content, index it, and present it to users, and there was a particular scheme, the _escaped_fragment_ recommendation, that we gave site owners so that we could detect the content and index it correctly. In October of last year we announced that we will no longer use that scheme, because it's no longer necessary. Before, Google had no way to see those dynamically updated pages the way a user would see them, but our technology has advanced quite a lot since 2009, when we first announced that scheme. Now, when we crawl a page and navigate a site, we see it and interact with it much as a user would, as if a user were viewing the page and tapping the links on a mobile device. So we have much more ability to render content that comes from JavaScript and from interacting with the page.

So the question is: for the sites that still have that fragment indicating "this content is loaded dynamically", what should site owners do so that Google doesn't lose that content? The important thing is that in general nothing should have to change: if the user is having an experience that lets them access that content, there should be nothing to change on the site owner's side for us to be able to see the pages. But it is important to confirm that things are working correctly. Use the Fetch as Google tool in Search Console to see how Google renders the page and how it can interact with it; in particular, if some of those pages load content dynamically on interaction, check whether Google manages to see that content. And if there are pages with a specific address indicated with the hashbang, check whether Google has a mechanism to discover that URL while crawling the site: if Google can't find the link or the interaction point, because it isn't crawling the page that includes it, then maybe we will never see that page. And if we can see the URL, and it renders in Fetch as Google, but in your logs it isn't being crawled or indexed in any way, then check whether there is some blockage, or some other way to let Google know that the URL exists, through a sitemap, a link index, or something similar. That said, usually there should be no change a site owner has to make for Google to understand a site that loads content with AJAX. I hope that answers the question, but if you detect that Google isn't picking up the content, we can support you in the forum, in Spanish as well, because debugging this kind of situation usually requires more than one interaction, repeated questions and answers; and if this doesn't answer the question, there is always some specific detail we can help with there.
Perfect. A question from Miguel Ángel Pérez: for my ecommerce site I have to import a large number of articles from my provider via a feed, and those articles are already indexed from other vendors. How would you recommend importing them, with noindex? In the long run can this be harmful? I know the ideal is to create original descriptions, but there are many products and many of them have no more information than what the provider supplies.

What we advise: when we have syndicated content that comes from other sources, the concern is not that it violates webmaster guidelines or that we'll interpret it as some negative technique. Rather, when Google sees many pages that have the same content, at some point it has to choose which one it will show in search results, and usually the pages receiving the syndicated content are not the ones that will appear in that position; usually it's the original source that should appear. What we recommend to those who are importing content is to include a link to the original article, if it's possible to do so, or to use noindex so as not to generate that situation of duplicate pages where all that content ends up being considered equivalent. And if that creates problems with your presence, because for example you have quite a large catalog and you're waiting for a product page to appear as a search result so that users reach you, then you also have to look at what added value you can bring. If it's just the same content that came from another provider, on an ecommerce site where the page is basically equivalent to another and the only difference is who is selling the product, then for a user there is no difference and no value is being created. So it's worth determining, given a huge catalog and a huge amount of information from your sources, what additional value you can offer to differentiate the imported content.
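One way to apply the noindex suggestion in a template is a per-product switch on whether an original description was written. This is a sketch under that assumption; the flag name is hypothetical, and `follow` is kept so the links on the page stay crawlable either way.

```python
def robots_meta(has_original_description: bool) -> str:
    """Robots meta tag for a product page built from a provider feed.

    Pages where we wrote our own description stay indexable; pages that
    reproduce the feed verbatim get noindex, so they don't compete as
    duplicates of every other vendor's identical page.
    """
    content = "index, follow" if has_original_description else "noindex, follow"
    return f'<meta name="robots" content="{content}">'

# A verbatim feed import: keep it out of the index.
tag = robots_meta(False)
```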
And then maybe you won't have to worry. Whereas if one page is very similar to another, there is no way to determine which is more relevant, in which case maybe neither appears and a third one does. I'll pause here: if anyone has questions, you can ask us through the chat without any problem.

Now this question about manual actions. Alejandro says: in recent months we have noticed that manual actions generated by artificial link problems are no longer as common as before. Today, are the algorithms in charge, one hundred percent, of determining whether a site's links, whether incoming or outgoing, are natural or artificial? Definitely the detection of manipulative, unnatural or artificial links that we do is not one hundred percent algorithmic, but we have improved very much in our ability to determine algorithmically which links were not placed in an organic manner. So the volume of manual actions to adjust for what the algorithms aren't detecting may have decreased, perhaps in certain regions or certain sectors, but we still have a fairly active group that is looking at those patterns, at what the algorithms fail to capture, taking the actions necessary so that weight is not given to links that are not there organically, that were placed artificially with the intention of manipulating the positioning in organic results. What we have also realized, and we have all commented on, is that in the SEO consulting environment many have already moved away from recommending what was recommended ten years ago, a kind of excessive manipulation: placing links everywhere, building huge networks of them. That was an approach that generated no value for a user, and for Google those links clearly
were placed with the intention of confusing our signals. There are far fewer of them now; established, reputable consultants are no longer recommending that, and they are making the necessary clarifications: that content should be original and organic, and that links placed because of an exchange of consideration or some kind of profit sharing should be indicated as such. So we have seen this practice shrinking. It hasn't gone away completely, so we continue making efforts so that the manipulation has no effect and so that users don't suffer as a result of this kind of manipulation. But in general we are seeing a decrease, in the sense that owners of quality sites who were doing that kind of thing because they were receiving that kind of advice, advice that was not good, have realized it was not good advice, and that there are better recommendations to make: building a good brand, generating good content, offering original, useful content appropriate to the audience and the segment being addressed. Presenting good content is, after all, what matters.

Still in relation to links, María Laura asks us about links to her site from her social networks. She has a blog and different segments of her audience on different social networks, and she asks whether that could generate some kind of problem or penalty; she wants to promote her site without hurting it. I would not worry: what you are describing, María Laura, is something very natural, very common and very logical. There are several channels through which people discover content; Google is one of the channels that direct traffic to a site, but there are other ways to discover content. And if the links you are generating on the
open web are there so that people discover your content, and you are following the rules of the platforms where you are posting them, which have their own policies about that, then you should not have a problem. In general, when we are looking at link issues, it's when someone is operating at scale with the intention of manipulation. That is, if someone is recommending "you must build all these links to climb in the positions", that already has a bad smell in itself; it's starting from the wrong side. But if it's what you are describing, "I have my content and I want several audiences to discover it, so I promote it on several sites and that generates links", that's normal and natural. If someone were scaling this in an automated way with the intention of manipulating, using those channels, we detect it and we discount it; but that's not what you're describing, so you have nothing to worry about. Hope that helps, and good luck.

We change topics again. A user wants to know whether the featured-news module, the AMP carousels in search, read any Google News data, such as the name of the source, or whether they are totally independent. The Top Stories carousel and the AMP carousels are Google Search features, part of organic search, and what appears there, what is or isn't represented, are decisions taken as a function of the Google Search experience. Google News is a separate product: a different product, with a different experience and different criteria in terms of inclusion and appearance. Of course they are responding to similar situations, but in particular the decisions about what appears in featured news, and which page's content appears, are taken differently.
But I would say that, if we see it in a slightly more logical way: if a page has a title, a summary, a picture, an author, a creation date, that information will logically appear in any of these experiences. But how it's extracted, how it's presented, how it's positioned, and whether Google at some point decides to represent or format it differently in one experience versus the other, will depend on each product.

They also ask us: when you look at the featured-news module for a newspaper brand, say "most recent from domain.com", why do various outlets appear with news from days ago, if Google News favors recency? Because they are different experiences, as I explained: Google News brings a different experience, a different offer to users; it's promising something different from what we are offering in Google Search in relation to featured news, because what users would be looking for is different, and the objective and the information shown can be different. So for those interested in appearing in the news carousels, it's worth understanding that Google News and Top Stories operate in different ways, but that, as Google, we look at the quality criteria in a very similar way: reputable sources, updated content that is well informed, and good signals sent to Google about how to understand the content and how to understand the relationships between its different versions.

One of the questions that came out of the forum thread, if you follow it, was a conversation we had with the owners of a sports site that has editions in many countries and many languages. They had a problem with a migration to a unified version, moving distributions in different languages to different subdomains, which maybe didn't have the success they wanted: there were some technical problems that sent Google confusing signals about which versions corresponded to the same article in different languages for different countries, and a lot of that is still something they are working to solve. Appearance in the Top Stories carousels is, first of all, algorithmic, and it's determined with quite particular quality criteria: ensuring that the reporting is truthful, the information fresh and clear, and the sources reliable. But the site also has to be well established and well set up to ensure the user will have a good experience on arriving, including the signals Google receives about the connections between alternate pages. That is, if there is a version of an article about something that happened, written in French, and there is an equivalent article with equivalent content written in Spanish, you have the possibility of establishing relationships between those pages, saying these are equivalent versions aimed at different people; or you may choose not to, because in the end, if someone is searching for that content in French or in Spanish, showing the French version to someone searching in Spanish, just because they happen to be in France, may not be the most relevant result. That is the particular situation we were discussing with the owners of vavel.com, the site I was mentioning; and, in Search Console, we had sent them quite a number of notifications.
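The relationships between equivalent language versions described above are declared with hreflang annotations. A minimal sketch of generating them follows; the URLs are placeholders, and note that every version should carry the full set of tags, including one pointing at itself, with the annotations reciprocal across versions.

```python
def hreflang_links(versions: dict) -> str:
    """Emit <link rel="alternate" hreflang=...> tags connecting equivalent
    articles in different languages. Each version of the article should
    include all of these tags (itself included), and the annotations must
    point back at each other to be honored."""
    return "\n".join(
        f'<link rel="alternate" hreflang="{lang}" href="{url}">'
        for lang, url in sorted(versions.items())
    )

tags = hreflang_links({
    "es": "https://example.com/es/noticia",
    "fr": "https://example.com/fr/article",
})
```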
Search Console is the tool we use to send site owners notifications about what is causing problems, what we understand to be situations that generate problems. Sometimes we generate indications for each and every page where we have seen the issue; sometimes we generate indications for a sample of pages where we have encountered problems. These are quite strong indications: there is a problem, it's worth paying close attention, and it's worth solving. And it's important to pay attention to them because, on our side, if there genuinely is a problem inside Google, if we really have something that isn't working correctly, it's quite difficult for us to understand the problem, debug it, pinpoint the cause, and then have an interaction with our colleagues in engineering and product to improve it; whereas the recommendations cover things we already know generate problems and that remain yours to resolve. So when someone says "I'm not showing up where I think I should" or "Google isn't seeing as many of my articles as I think it should", my recommendation is always to first try to see what you can do on your side to solve it, because that's the best answer in the end. If at some point we detect that it's a problem on our side, we'll find methods to solve it, but that resolution can take a lot of time and may even be harder, because then zero percent of the solution is in your hands. That's also why, when we chat in forums or at conferences or in this kind of channel, "we genuinely don't see a problem on Google's side" is a good answer: it implies there is something you can do on your side, that you have control over how to solve the problem. The initial recommendation is always to concentrate your attention there and find the resolution, or clearly determine "we already solved this" or "this is not a problem". It's harder when that identification is discarded out of hand, like "it can't be my site"; in the end, if, from what we can see, we say "pay attention to that and you could solve the problem", that approach is more likely to succeed.

If your sites are not verified in Search Console, we can't send you any of these messages, and we both lose a lot of communication and a lot of information that is otherwise so easy, so accessible; so we recommend that you verify your web pages in Search Console. That is the main thing, the first entry point, the first point of departure: Search Console sends notifications and alerts to sites, definitely. And if maybe we are not making them clear enough, or maybe we could improve how we present them, please let us know; there are quite a few reports, several areas to look at, and with the information you can give us about which experience you preferred, perhaps we can help the team that designs the tool to improve the experience. We are doing experiments with the new version of Search Console; at this point it's even possible, in certain cases, to try it, and if it's possible for your site you will see an invitation to try it. That's a good opportunity to also see what information is useful to you and how useful it is, and from there we can route that feedback to our team. And indeed, if you want to make recommendations to the Search Console team, you can do it directly in the new tool itself, if you have access to it: there is a "Send feedback" function, which
is what they call it in English, and you can use it without problem; the team is in fact very receptive and grateful for these comments, because they are working on it right now.

We have more questions, so let's continue. Daniel Pinillos has two doubts. One: does a JavaScript link, say an element with an onclick handler, pass the same PageRank as an accessible link, that is, a normal <a href> link that is visible to a user in the middle of a page, versus a link in the navigation bar? In principle, nothing else happens: for Googlebot, a link is a link. If Google manages to see it and navigate to it, it interprets and identifies it. But that's where it's important to check with, for example, the Fetch as Google tool, because if it's a URL that is not visible to a user, that a user could never click because it doesn't appear at any moment, then a user is not going to be able to interact with it, and that clearly indicates it may not be an organic link marking a connection a user would use. If there are links of that type, dynamically generated, that Google somehow is not detecting, we may have to look at whether something in how the page renders is not correct; but usually that's not where problems are generated, from what we understand.

And his second question: if we load a link after the render of the page, does that link pass PageRank the same as one visible in the first complete paint? I would step away completely from the notion of PageRank or no PageRank here, because if you are asking about PageRank in terms of how a link is represented, you are starting from the wrong side. The link has to be for a user; the link must let a user find a relationship between one piece of content and another. If not, the link doesn't exist except to tell Google "here is
something else that you can see". So if we start from that side: if the user can find it, Googlebot, behaving as an ordinary user, should be able to find it as well, in which case it would crawl it. And the decision of whether it passes PageRank or not is completely separate from whether it is rendered with AJAX; it has more to do with all the other factors that go into a link. If the question starts from that side, it's worth asking first what the goal of the link is, because perhaps the intention of putting the link there isn't to offer something useful to someone who is browsing the page; and if it's starting from there, it might become a guidelines problem, in which case we'd have to look at it a bit more carefully.

He also asks how effective the use of rel=next and rel=prev pagination is if there are no links to those pages; for example, on mobile versions it's very frequent that those links are not painted, but the next and prev annotations are added. It's known that the ideal is to have those links, but what if you can't, and there are no links? Then rel=next/prev is just an indication about a relationship for a link; it is not itself a link. If there is an indication that perhaps there is value here, but there is no link included, then there is nothing for Google to crawl; in the worst case, Google simply finds nothing to see there. Whether that has negative impacts depends, I would imagine, on the volume of that kind of experience and how frequently it indicates other types of problems, like incomplete markup or generally broken HTML that is difficult to interpret, in which case it can be more difficult to understand the content.
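For reference, the rel=next/prev annotations under discussion can be generated like this. A sketch only: the URL pattern is illustrative, and, as the answer stresses, these tags are hints about a series, not a substitute for real, crawlable <a> links to the paginated pages.

```python
def pagination_links(base: str, page: int, last_page: int) -> list:
    """rel=prev/next hints for page N of a paginated series.

    These are only hints: if the pages are not also reachable through
    normal <a> links, there may be nothing for Google to discover."""
    links = []
    if page > 1:
        links.append(f'<link rel="prev" href="{base}?page={page - 1}">')
    if page < last_page:
        links.append(f'<link rel="next" href="{base}?page={page + 1}">')
    return links

# Page 2 of 5 gets both a prev and a next hint.
head_links = pagination_links("https://example.com/c", 2, 5)
```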
But in general I wouldn't worry about it. Another question: why, when blocking via robots.txt pages or places that we don't want Google to crawl, does the count of indexed pages also go up? Is it because it crawls other URLs, or because those URLs, which Google hadn't indexed before, now count as indexed when blocked? That doesn't come with a screenshot, but let's take a look: pages blocked by robots.txt, and the total indexed goes up. What I would imagine in this case: if there are that many blocked pages, maybe it's because they are being generated in an automatic or semi-automatic way, and robots.txt is being used so that pages generated semi-automatically don't get indexed. So I'd imagine there is still a process generating additional pages that can be discovered. And when there is volume like that, Google is basically looking at how the crawl volume that Google would direct at a site is divided. Google is never going to try to fill or saturate the capacity of a server; what we want to do is simply understand the content without generating an operating cost for the owner of the server, so we behave like a common, respectful crawler. Part of that is that we are not going to try to saturate a full site, especially if it's huge, because that generates load. So when a number of pages are blocked, the crawl volume that was being spent on those pages can go to other pages: what we had dedicated to seeing those pages we will now use for other pages that maybe we hadn't seen so often, or that we had found links to but had never visited. I think that would explain it. But in a
And when I see a site that has such a large volume of URLs blocked for the crawler and, at the same time, so many pages indexed, that usually tells me that something in the generation of content is automated, or that very many URLs are being generated for each piece of content, or many variants in particular, because it is a quite large and complex site. But I would look at it from that side: a lot of crawling resources used to go to those pages, and now that they are blocked they go to others. Next question: why am I not indexing new articles on a new website, while on websites with more seniority the Google robot can apparently access them with no problem? He says he uses Fetch as Google on the site and it works. Is there some indexing problem lately? Well, the main thing is to look for the particular page using one of the operators, info: or the site: operator, and see if it appears in the search results. That is, if the page shows up when you do a targeted search for that page and nothing more, then the page is indexed. Whether it appears in a particular position already has more to do with the content and the reputation of the site, with how much that content matches what people actually search for, and so on. If Google is not crawling the site, is not discovering the pages and simply has not visited them, then yes, there must be some problem. But if there is a sitemap, or once there is an initial URL, Google should already be able to discover the other pages from there. There are certain pages that do not interest us, because we cannot crawl everything and we have to make choices at some point; but in general those are pages that would end up being a 404 one day, or pages that are basically empty: a little content at the top and the rest simply blank space used as filler.
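To illustrate the discovery path just mentioned, a sitemap plus an initial URL, a minimal XML sitemap could look like this (the URL and date are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Hypothetical sitemap listing a new article so the crawler can
     discover it without depending on internal links alone. -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/articulos/articulo-nuevo</loc>
    <lastmod>2017-06-01</lastmod>
  </url>
</urlset>
```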
In a case like that we may choose not to index them. But if it is new content, new pages, and Googlebot is crawling them and they appear in the results when you search for that specific page, then it is a matter of competing with other relevant content. There are many questions, and a lot of people waiting, so let's continue. Could you comment on the March-April update, the one the industry has given a name, which affected rankings on google.es? We have seen websites with hardly any authority, and very poorly optimized, reaching top positions for very relevant keywords. Well, in general we do not make announcements of algorithm updates; we stopped doing that several years ago. There are many people in the industry who decide to give names to updates, and usually what I see when reading about them is that they conclude something has changed and try to interpret where it came from or why. So I have nothing to comment or add about that, because in the end we do not comment on changes. We are modifying the algorithms frequently, improving them in a constant way, so in some way something changes every day. Next question: how many 301 redirects are the limit before a page loses its validity? How many 301 redirects can there be without losing the value of a page? I would ask it the other way around:
how many 301 redirects are necessary for a user to reach the content they are looking for? There are 301 redirects that have existed for a long time, where you are simply trying to maintain a chain: this URL used to be here, then it moved, then it moved again, and how it got there no longer matters. Those old rules may no longer have organic references anywhere, so at some point those 301s could just as well be a 410, or a 404: the page is gone and we no longer worry about it. That said, I do not know of a fixed limit, a specific number where we say that after x redirects we stop. What I would say is this: if you find that you have many 301 redirects and you are worrying about whether that is an indexing problem, it is also worth looking at whether it is a problem of resource consumption on your own server. Every 301 is an HTTP request; every 301 hits your server while the client is trying to find the corresponding page. If you are simply making a chain of 301s, at some point you are consuming resources, and those costs fall on you not only when Googlebot looks at those pages but also when a user arrives at them. So it may be worth going back and seeing what value those existing rules are generating, and perhaps, instead of keeping a chain of redirects, redirecting each old URL individually to the final destination and leaving it at that. Next question: my site looks good for users with the use of AJAX using the #! escaped-fragment scheme, but now Google says this is obsolete. In the end, do I have to focus on search engines and not on users? Must I make changes for the search engine and not for the user? That connects with what we were answering at the beginning.
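Returning for a moment to the redirect chains discussed above: a minimal sketch, with hypothetical URLs, of flattening a chain of 301 rules so that each old URL points directly at the final destination in a single hop:

```python
# Sketch: flatten chains of 301 redirect rules so every old URL
# points straight at its final destination (URLs are hypothetical).
def collapse_redirects(rules):
    """rules maps old path -> target path; returns a map in which
    every old path points directly to the end of its chain."""
    def final_target(path):
        seen = set()
        while path in rules and path not in seen:
            seen.add(path)           # guard against redirect loops
            path = rules[path]
        return path
    return {src: final_target(dst) for src, dst in rules.items()}

chain = {"/old-a": "/old-b", "/old-b": "/old-c", "/old-c": "/final"}
print(collapse_redirects(chain))
# {'/old-a': '/final', '/old-b': '/final', '/old-c': '/final'}
```

Each entry then resolves in one request instead of walking the whole chain, which is exactly the server-cost point made above.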
Using the fetch and render tool in Search Console, are you seeing that Google, when it fetches the basic URL that a user would request, is not seeing the content that the user would find? In that situation, if the URLs only appear through dynamic interaction with the page, there are other ways to let Googlebot know about those URLs. They do not have to be pre-rendered pages; they can simply be normal links that would give a user who follows them a full experience of that page and that content. Think about it not in terms of how Googlebot sees it, but as a user: a linked URL should serve that content, and how would a user discover it? At that point you can also use sitemaps to make it easier for Google to discover those links or pages, if there are no external sources, other places, where Google can find those URLs. We have 15 minutes left. Alejandro asks: if a page publishes images that were already previously published by other sites, could this affect it negatively, given that the images are not original? It depends on the reason why you are showing the photos. In the end, the main answer is: what is the value for the users, and is it a unique value or not? That is what determines relevance. A lot of people ask whether Google is going to punish someone for this or that. Google does not punish; Google makes determinations about the relevance of a page to a user's search. So when you have images from another source: does it make sense for them to be there, do they have context, do they have some utility for a user? Then the page is interpreted that way. If it is just downloading a bunch of photos from another site to generate a copy of a catalog with a different label on it, then it is not so much a matter of Google taking actions to lower your position; it is simply a question of what value it brings the user.
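Picking up the point above about links that Googlebot can discover without pre-rendering, a small hypothetical contrast between a script-only trigger and an ordinary crawlable link:

```html
<!-- Hypothetical markup. A click handler alone exposes no URL
     for a crawler to discover: -->
<span onclick="loadArticle(42)">Read more</span>

<!-- An ordinary link still works for users (a script can intercept
     the click) and gives Googlebot a real URL to crawl: -->
<a href="/articulos/42">Read more</a>
```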
If I am looking for content and an image appears that takes me to a page that does not give me the information I am looking for, I get frustrated; that is the experience you are giving the user. If you always start from there, you get it right, because then you do not have to be very worried about whether it is going to generate penalties or difficulties or risks. We make manual adjustments and algorithmic adjustments based on relevance, and I think worrying so much about whether Google is going to take some negative action is misplaced. If you are on that side because you are doing it at scale, or because it seriously worries you, then yes, you should worry that you are doing something that is not useful for a user. In the end it comes down to determining which users you serve, what value and what content they are looking for, and why your site rather than the others that exist, which may have similar, related, or contrasting content. You asked us about URLs: is there a maximum length for URLs? On his site they are sometimes very long, but the content organization requires it; and he sends greetings from Mexico. Greetings likewise; we are here in Dublin, and today the sun is almost like in Mexico, well, not quite, and in Mexico it is very early, so thanks for connecting. No, we do not have a clear indication of a maximum URL length. What I would say is: if Google finds the page interesting and is already crawling it, you do not have to worry. If Google is not visiting it, is not looking at it, and it is a page with relevant and deep content, then maybe there is some theoretical limit we are running into. But if the content manager requires what it requires; in general, URLs that are more interpretable
for a user are also more useful for a user. If there are good reasons to have quite long URLs, maybe they carry codes, maybe they carry hashes indicating customization or adaptation options, then I would not worry so much on that side, as long as Google is crawling the pages. If it is not crawling them, use Fetch as Google and later determine whether Google is having trouble accessing the page; and if Googlebot has a problem accessing a very long URL, Googlebot will not be the only one with that problem, users may run into it too. Let's see, this question is about the sandbox. Miguel Ángel says that the other day we commented on Twitter that the sandbox as such does not exist, but that there are similar factors that act like a sandbox. What factors are these? Is it possible for a website less than a year old to start up and compete with websites that have a lot of seniority and authority? And what would be your recommendation? For sites with less seniority, the recommendation is to identify who the right audience is and what the right content is for those users, and to determine what the user-acquisition strategy is. I believe many people worry that they are not ranking even though they have original and unique content, and that this should be enough; but in the end it also matters what experience a user had previously, the reputation a brand has, and the confidence a user has in that brand, because that affects the number of links users share with each other, or the content they publish elsewhere, and a new site is always still building that. So you always have to have a bit of realism: an entity that is established, that is constant, that has a brand, a name, an established reputation, already has by itself, because it has already made that investment in brand creation, experience creation, and user registration,
an advantage over someone who is just starting. You have to work at it, in general; but that does not imply that a new site cannot do it. Whoever starts a new site should have a realistic perspective. If they are trying to enter a competitive market against entities with established names, or that already have a good audience, then having unique content is not only very important, it is the entry ticket: there is no option to compete if there is no unique, useful, informative content that delights a user. But that is not the only thing either. You also have to have a determined marketing plan, a user-acquisition plan, a promotion plan, and all of those things do not come by themselves; they take work and effort, and we do not try to discount that. There is no magic recipe where a site appears just for being there, just for having content. You must also know that it takes time, and taking time does not imply that there is some artificial control that limits the possibility of a new site appearing. You have to separate what is a real effect from what is simply a situation so common that it might seem as if it had been defined as a rule. And I think we have time for a couple more questions. A question about hreflang; these questions always arrive, and it can be complicated. Sometimes, even if we have correctly implemented hreflang for versions for different countries in the same language, Google shows us the wrong results. For example, for the versions for Colombia and Argentina, the google.es version comes up instead of the version for those countries. The versions are exactly the same, and for business reasons the content cannot be differentiated, but each country must have its own version. How can we correct it? And wouldn't a solution be
easier to give more weight to the geotargeting setting in Google Search Console? Well, you have to separate two things: on one side, the function of hreflang, which is to establish relationships between versions of a page that are equivalent but vary by language; on the other, geotargeting, which indicates that this page, or this version of this site, is specifically more relevant for users in a particular geography, a particular country. Those are things that carry different weights; geotargeting is perhaps something that adds weight, while hreflang simply gives signals, and maybe looking at it that way helps. The hreflang annotations of language and country versions, and the relations between pages, are indications for Google that, when it has the option to present a result to a user who is in a specific context, and there is a version that exists for that user's language, or language-and-country combination, the content owner has confirmed that this is the relevant one to show. And there are certain patterns you have to keep in mind. Those links have to be set on both sides: if there are four articles that are versions of the same content, country-specific for business reasons, each of the four has to reference all four; that is, the alternate version of this page for this language, or this language-country pair. The second thing is that it has to be by language first; then you can add the country if you want, but you cannot do it only by country. It has to be language, or language and country. And those connections have to be made in both directions. If you have several versions, that map multiplies quickly, and that is where it pays to have it well organized.
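To make the reciprocity requirement above concrete, here is a hypothetical set of annotations for the same article targeted at Spain, Argentina, and Colombia; every version carries the full set, including a reference to itself:

```html
<!-- Hypothetical head annotations, identical on all three versions. -->
<link rel="alternate" hreflang="es-es" href="https://example.com/es-es/articulo">
<link rel="alternate" hreflang="es-ar" href="https://example.com/es-ar/articulo">
<link rel="alternate" hreflang="es-co" href="https://example.com/es-co/articulo">
<!-- Language first ("es"), optionally narrowed by country ("es-ar");
     a country code on its own is not a valid hreflang value. -->
```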
The other thing: the annotations do not have to live in the page itself; you can also put them in your sitemap, with the same language, or language-dash-country, indications. From there you also have to look at the domain, because sometimes the domain plays a role. There are certain domains that send a very clear geographic signal, so clear that there is no way to modify it in Search Console. That is, if someone has a .ie domain, that is an anchor indicating content for Ireland, and there is no way to change it and say no, this is for another country. There are other country domains, like .co for Colombia, that in the end have become so associated with generic commercial use that Google does not even see them as targeted at Colombia; it sees them as generic domains. We have a fairly extensive list in the help center indicating which country domains we see as clear geographic indications, which domains we see as generic, and for which of the generic ones you can indicate geo-specificity in Search Console. But returning to the question: debugging the experience for particular users, in particular languages and countries, can be quite difficult. One of the things we try to do is put notifications in Search Console when we see something confusing. That is, if there is a notification about a set of URLs where the return annotations are missing, it implies that we are going to ignore all of that markup, because something is incomplete; the map is not well drawn, so we say this map is incomplete, we are not going to rely on it, and we will be guided by what we do know about the content of each specific page. So if there are notifications, resolve them; if there are things to adjust,
adjust them. Also identify, in a certain way, whether it is possible to simplify, because simplification also helps in terms of administration. There are several sites where I would sometimes like to ask: do you really need to target every country and language, making many country-language versions? The map multiplies in a quite complicated way, administration can become complex, the sites become complex, and the headers that carry all those annotations become complex. So sometimes it is worth determining whether to simplify the structure. Do you seriously need to establish a relationship between the version dedicated to Belgium and the version dedicated to Australia? Maybe not; maybe they are completely different properties and do not have to have a relationship between them. Or maybe the version for Spain and the version for Argentina can be completely separate, while the versions for Argentina, Paraguay, and Uruguay may need some kind of relationship, because there is going to be traffic, or movement of people, between those countries. So it is a matter of understanding the audience and the intention a little better. We see this question a lot in the forum; we receive it often, and I think it would be worth doing a session dedicated particularly to that, because the situation of hreflang and translation, in the context of the Spanish-speaking world, is something we have to know how to do right, and it can become quite complex. I gave a presentation about this in London last year and maybe we can also share it; and if we can, surely we will have one of these sessions that goes into it more thoroughly. In any case, we have a lot of articles that we can leave in the description of this video, where you can read about hreflang in a more exhaustive way and how to implement it correctly, because it is true that sometimes
it complicates the situation, and it can be much simpler. But it looks like we have a minute left, so I think that is it. First of all, thanks to everyone for the questions; we are here to answer them to the extent that we can. We always appreciate it when the questions are specific and you manage to give us the URL of your site; that is much easier for us than when the question is simply theoretical or abstract, although we are happy to answer the theoretical and basic ones as well. We appreciate the trust you place in us; we are here to answer your questions. We have the webmaster support forum in Spanish, where we are willing to support you and where there is also a community of external experts who may be able to give support that we cannot. Sometimes it is a little difficult for us to tell someone who has a site that what they are doing is perhaps not ideal, or that the site is not very good; but sometimes it is important that someone can say it, and for that there are members of the community who do it gladly and with a very, very generous spirit. That is why we know the forum is a good place to get support. But we are also here to make sure that you have ways to interact with us and ask your questions when necessary. So thank you very much for sending us your questions, and for your follow-up on Twitter; you tweeted a lot and we value that greatly. And well, thank you very much, Felipe, for dedicating your time, because this is priceless. Good, thank you all very much, and stay tuned. Bye.