Which Google Webmaster Tools Do You Want? (View post)
Tony Ruscoe | Monday, September 18, 2006 # |
- An option to easily remove URLs from the index (either individually or by providing some kind of XML sitemap file that only contains dead URLs). Currently, removing old pages from the index takes a long time, and telling Google, "Please don't bother even spidering this URL – it's long gone!" would surely help them too. (I guess this could behave like robots.txt does for URLs they shouldn't index – so once a URL is dropped from this new file, it's possible it might be indexed again...) A rough sketch of what such a file might look like follows at the end of this comment.
- A tool to test / track how a specific site I own ranks for a set of keywords / phrases. You can do this using third-party tools, so why not just build it into Google Webmaster Tools instead?
(And I'm still thinking... I'm sure I've thought of loads in the past!)
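A minimal sketch of the dead-URLs file from the first point above – the format is entirely hypothetical (it simply borrows the Sitemap XML structure; Google accepts nothing like this today):

```python
# Sketch: generate a hypothetical "dead URLs" sitemap. The <urlset>
# wrapper is borrowed from the real Sitemap protocol, but a dedicated
# removal file like this is pure wishful thinking.
from xml.sax.saxutils import escape

def write_dead_urls_sitemap(dead_urls, path="dead-urls.xml"):
    """Write a removal-request file listing URLs that are gone for good."""
    with open(path, "w") as f:
        f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
        f.write('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
        for url in dead_urls:
            f.write(f"  <url>\n    <loc>{escape(url)}</loc>\n  </url>\n")
        f.write("</urlset>\n")

write_dead_urls_sitemap([
    "http://example.com/old-page.html",
    "http://example.com/moved/elsewhere.html",
])
```
|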
Neil Richard | 18 years ago # |
I'm a sucker for reports and numbers, so any easy way to track page views, clicks, sources, etc. is great. Google Analytics is a nice tool, but it needs a bit more tweaking to make it perfect (like better functionality for blogs).
Aside from that, easier ways to make money. Look at how Revver does it with videos: you, the viewer, don't pay up front, but the producer gets at least some kickback. |
Alex | 18 years ago # |
Personally, it would be nice to have an overview of duplicate content pages – pages that are so similar that they get kicked out of the index and drag the whole site down.
It's a little bit stressful to have to guess why a site with many pages drops in the rankings.
Alex |
Tony Ruscoe | 18 years ago # |
- Something else I'd like to know is how many people are subscribed to my website's feeds in Google Reader, Google Desktop or the Google Personalized Homepage. There's currently no way to find this information. (I believe other feed readers include this information in the User-Agent header when they fetch your feed, so you can check your logs and see – you can also view this info in FeedBurner – but Google don't seem to release this data.)
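Until Google exposes those numbers, you can at least count the aggregators that do report them. A minimal sketch, assuming an Apache "combined" access log and readers like Bloglines that put "N subscribers" in their User-Agent string:

```python
# Sketch: extract the "N subscribers" counts that some feed readers
# (e.g. Bloglines) report in their User-Agent when fetching a feed.
# Assumes an Apache "combined" log; adjust the feed path to yours.
import re

UA_SUBS = re.compile(r'"([^"]*?(\d+)\s+subscribers[^"]*)"')

def subscriber_counts(logfile, feed_path="/feed.xml"):
    counts = {}
    with open(logfile) as f:
        for line in f:
            if feed_path not in line:
                continue
            m = UA_SUBS.search(line)
            if m:
                user_agent, subs = m.group(1), int(m.group(2))
                reader = user_agent.split("/")[0]  # e.g. "Bloglines"
                counts[reader] = max(counts.get(reader, 0), subs)
    return counts

print(subscriber_counts("access.log"))
```
|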
Mambo | 18 years ago # |
I think an easy one would be to integrate the "Add URL" feature into Webmaster Tools, so the starting-up of a new website can be managed in a single place.
I like the idea of calculating feed subscribers.
I also think that there should be an automated check for accessibility – that covers some of the guidelines given in the help section. |
Tony Ruscoe | 18 years ago # |
- I personally wouldn't use this feature, but maybe there should be an option to disable caching at a per-site level using Webmaster Tools instead of relying on the META tags (e.g. <meta name="robots" content="noarchive">) in each page.
- What about a "contact info" page that you could populate similar to what Alexa offers? I realise this could be open to abuse by spammers but if it was carefully structured and didn't allow HTML, it could be useful. (I suspect many people try to find a website simply to find their contact details – and this would make it much easier if owners of badly designed websites completed this information!) |
Sohil | 18 years ago # |
1. A way to transfer PageRank from an old domain to a new domain.
2. Removing pages from Google's index.
3. A bit of tweaking to Analytics.
4. Info on dupes.
5. Calculating feed subscribers.
Tony and Alex covered it all :) |
Domisto | 18 years ago # |
I would like to third the way to remove old pages. That would be great.
It would also be nice to compare the search results of two or three sites at a time, for your and their top 5 (or 10 or whatever) search results.
Also, some way of showing results for subpages – or at least highlighting any pages within a site with drastically different content (seeing that one might want to target a certain audience with a page other than the index). |
Rishabh | 18 years ago # |
I would like a feature that would instantly give you huge PageRank and decrease the PageRank of every other website on the internet! But seriously, I would like a better interface to work in, so it is easier to change the settings and stuff. |
Christian Lund | 18 years ago # |
* Some sort of quantity indication on "Query Stats" would be nice. Could be something like: number of clicks, number of searches. If that is too revealing then perhaps a percentage of total.
* A way to track how certain selected queries are doing over a period of time.
* An indication of how Google rates the importance of each "Common Word" found on your site.
* Top 10 pages on your site listed by page rank per month.
* More statistics...
|
NC | 18 years ago # |
I have to go with the majority of posts here and say I would LOVE a tool to quickly remove all the Supplementals of my site in the DB.
If they are in Supplemental, then they are either old, changed, or dupes – so why oh why do they hang around like an albatross around our necks, unable to be deleted en masse to fix up our search engine ranking?
When I made my first sitemap for Google, I accidentally added the post, the reply and the print versions. This action put my entire board into supplemental hell, from which I cannot get out.
Google refuses to delete the data, and it's too huge a mess to do each one by hand. |
t xensen | 18 years ago # |
I don't suppose there's any chance of G revealing the link that resulted in a "page not found." It would be helpful. |
A tool | 18 years ago # |
A lie detector – to detect when Google Webmaster Tools lies about the status of banned sites.
(Thinking it over – seems like this is needed for a lot of Google's communication with the world...)
|
douglaskarr.com | 18 years ago # |
HTML verification, XHTML verification, CSS Optimization, Accessibility ranking would be nice. It's becoming more and more of an issue and will surely take a spotlight. I realize these aren't necessarily Google Webmaster tools... but they are "Webmaster Tools". |
PiGuy | 18 years ago # |
I second what everyone else stated.
[URL removed. -Philipp.] |
Josue R. | 18 years ago # |
I would love to see a history of PageRank per page – something that would show not only the top queries for the individual page, but also a weight/stats table against competing & related webpages: which popular keywords were the target of the indexed page, and which didn't get any queries or clicks in a monthly/quarterly/etc. period. Also, as part of the history of the page: what content was mostly viewed/read or popular by user-agent clicks; when the individual page was last changed on the website, last updated in a Google datacenter (indexed), and last accessed by spider-bots; what was approved by spider-bots & what didn't get approved (e.g. invalid HTML/JavaScript); and possible warnings about things which may violate standard practice – not just for Google but for all search engines (as in bad SEO tricks vs. good SEO tips). |
Josue R. | 18 years ago # |
OK, OK, I know Analytics offers most of the above things I want, but I'd really like to have Sitemaps & Analytics as one fluid application which can also be customized by the webmaster. |
Susan Geraeds | 18 years ago # |
I'd like to have information about duplicates and actual numbers for site statistics. |
Jayenkai | 18 years ago # |
A full IRC-like Ajax chat, easily insertable into your site but running off Google servers, since I can't manage to find a decent one for my site ;) It should work like Gabbly, but have actual admin functions!
The ability to upload images to Google Images, so we don't get any more annoying "This ImageShack image isn't here" signs.
Alright, so maybe they're not webmaster-specific, but they'd be handy nonetheless. |
Elias KAI | 18 years ago # |
I think they took into account my last call for Google Spreadsheets features (easy typing-in of addresses).
I'd like to be able to report the recent spam from this Russian guy who ended up with thousands of Plone pages in the top Google SERPs.
And to have one single vertical channel showing money conversions per keyword from organic searches, with a way to compare it against competitors or AdWords ads.
Thanks. |
pip | 18 years ago # |
Detection of dupes – or something which tells me which class of pages are only considered for the supplemental index, and why – would really be great.
At least with big catalogues/forums I tend to lose the overview. But since all of them are white hat, I'd prefer Google to prompt me to do some tweaking on them. |
Philipp Lenssen | 18 years ago # |
I want a visualization of the internal link structure on my site. (Backlinks visualization would be even nicer, but I heard Google has reasons not to display this, e.g. the link operator only shows some links.) Something with circles and lines so I can see the homepage in the center, other hubs around it etc., all with page titles, clickable so they take me to a particular page. Possibly, also a visualization based on PageRank, top performing AdSense on pages, top searched pages and so on, all neatly done in Flash so I can move stuff around.
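Until something like that exists, here is a rough self-serve version – a sketch that crawls a small site and emits Graphviz DOT, which `dot -Tpng` can then render as exactly this kind of circles-and-lines picture (no Flash; it assumes plain HTML pages on a single host):

```python
# Sketch: crawl a small site, collect its internal links, and print
# a Graphviz DOT graph of the structure (render with `dot -Tpng`).
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
from urllib.request import urlopen

class LinkCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

def crawl(start, limit=50):
    host = urlparse(start).netloc
    seen, queue, edges = set(), [start], []
    while queue and len(seen) < limit:
        url = queue.pop(0)
        if url in seen:
            continue
        seen.add(url)
        try:
            html = urlopen(url).read().decode("utf-8", "replace")
        except OSError:
            continue  # unreachable page; skip it
        parser = LinkCollector()
        parser.feed(html)
        for href in parser.links:
            target = urljoin(url, href).split("#")[0]
            if urlparse(target).netloc == host:  # internal links only
                edges.append((url, target))
                queue.append(target)
    return edges

print("digraph site {")
for src, dst in crawl("http://example.com/"):
    print(f'  "{src}" -> "{dst}";')
print("}")
```
|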
Ionut Alex. Chitu | 18 years ago # |
Like this http://kr.webzari.search.yahoo.com/search/webzari?p=http%3A%2F%2Fblog.outer-court.com&ret=1&fr=kr-search_top&x=0&y=0 ? |
Ionut Alex. Chitu | 18 years ago # |
Also this: http://kr.webzari.search.yahoo.com/search/webzari?p=http%3A%2F%2Fblog.outer-court.com&Urlinfo=suburls |
Mambo | 18 years ago # |
Wow that's impressive. I wish I could understand it. |
killian | 18 years ago # |
A way to add another user to the webmaster account. This can currently be done in Google Analytics. |
Anders Dahnielson | 18 years ago # |
I second the suggestion of (X)HTML validation. Having at least that is a no-brainer to me. Validation by spidering is difficult to do satisfactorily (besides 'Nikita the Spider', which is great), and spidering validators usually limit the number of URLs checked to 50 or 100, which is way too low for a blog with many archive pages in addition to the posts. I have also never come across any spidering validator that takes advantage of my sitemap.xml instead of trying to figure it all out by itself.
Sure, validation of CSS and other stuff could be integrated too, but (X)HTML is the big thing. Having validators integrated into the Webmaster Tools would be a great way to keep track of a website's "health".
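A bare-bones stand-in for the sitemap-driven validator Anders describes – this only checks each listed page for XML well-formedness, which is far weaker than real (X)HTML validation, and it assumes the sitemaps.org namespace:

```python
# Sketch: read a sitemap.xml and check every listed page for XML
# well-formedness. No DTD validation, but it walks the sitemap
# instead of guessing the URL list by spidering.
import xml.etree.ElementTree as ET
from urllib.request import urlopen

SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def check_sitemap(sitemap_url):
    tree = ET.parse(urlopen(sitemap_url))
    for loc in tree.iter(SITEMAP_NS + "loc"):
        url = loc.text.strip()
        try:
            ET.fromstring(urlopen(url).read())
            print("OK      ", url)
        except ET.ParseError as e:
            print("INVALID ", url, "-", e)

check_sitemap("http://example.com/sitemap.xml")
```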
|
Iolaire McFadden | 18 years ago # |
I would like to see a page rank tool. You have some page rank related information, but it is not really enough to get a gauge of how your content ranks. I would be happy to be limited to say 1 page rank test per day or something to prevent users from abusing the tool solely for SEO. |
Tony Ruscoe | 18 years ago # |
For all those asking for (X)HTML validation, Chris Riley wrote a validator that can accept an XML Google Sitemap as its input:
http://cgriley.com/validator/
|
Sankar Anand | 18 years ago # |
Just like what Sohil said – I wish all of this would come true...
1. A way to transfer PageRank from an old domain to a new domain.
2. Removing pages from Google's index.
3. A bit of tweaking to Analytics.
4. Info on dupes.
5. Calculating feed subscribers. |
Anders Dahnielson | 18 years ago # |
Thanks for the link Tony, it's great!
But I still think validation should be integrated into the Webmaster Tools, because it doesn't have to be "live" – validation errors could simply be reported among the other statistics (unreachable URLs etc.). |
Anders Dahnielson | 18 years ago # |
Sankar: I think using FeedBurner for feeds is a more efficient way to get subscriber stats than relying on Google, since it will catch everything. |
alek | 18 years ago # |
I'm a bit late to the thread, but ditto what most people said above ... although WRT validation, other great options are the W3C tool and/or HTML Validator plug-in to Firefox which uses Tidy.
Having said that, I can envision a really, really cool offering to be "show all pages that don't validate" which would do exactly that – heck, Google is spidering your entire site anyway, so that would be a nice benefit.
BTW, I noticed that the search query stats seem to have about a 5-6 week delay in one situation I noticed – it used to say it was an average over 3 weeks, but that seems gone now – consider adding language back in to highlight exactly what the expected lag is (right now, it generically says it may not match current results).
Finally, on the PageRank display, rather than using none/low/medium/high bar charts, just show the numbers – it gives the SEOs of the world more to talk about. Better yet, say this is the "current" PageRank (versus toolbar PR, which is only updated every few months) and this will create major ga-ga in the SEO world! ;-) |
lftl | 18 years ago # |
Have the Google API return data that more closely matches the live data. |
TOMHTML | 18 years ago # |
I just want to know WHICH SITES have false links to my site. For example, if a site makes a link to mysite.com/doesnotexist, I see the error in Google Sitemaps, but nothing more :-(
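If the visitor's browser sent a Referer header, your own access log already answers this. A minimal sketch against the Apache "combined" log format:

```python
# Sketch: list which pages link to your 404s, using the Referer
# field of an Apache "combined" access log. Works only when the
# visitor's browser actually sent a Referer header.
import re
from collections import defaultdict

LOG = re.compile(r'"(?:GET|POST|HEAD) (\S+)[^"]*" (\d{3}) \S+ "([^"]*)"')

def broken_link_sources(logfile):
    sources = defaultdict(set)
    with open(logfile) as f:
        for line in f:
            m = LOG.search(line)
            if m and m.group(2) == "404" and m.group(3) not in ("", "-"):
                sources[m.group(1)].add(m.group(3))  # path -> referers
    return sources

for path, referers in broken_link_sources("access.log").items():
    print(path)
    for ref in sorted(referers):
        print("   linked from:", ref)
```
|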
Philipp Lenssen | 18 years ago # |
On a side-note, here are some ideas from Matt:
- full backlinks for a site
- causes of 404 pages
- broken outlinks on a site
- more info on spam penalties
- rank checking
- ability to show/download all pages from a site (e.g. if the server crashed)
- ability to "disavow" backlinks to a site
- tell Google the correct country for a domain
- tell Google a parameter doesn't matter
- verify an IP as Googlebot
- fetch a page as Googlebot to verify correct behavior
- diagnostic wizard for not showing/ranking
- see feedback on spam reports or reinclusion requests
- revamped URL removal tool
- reset or refresh the list of crawl errors
- specify preferred time of day to crawl
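One of these – "fetch a page as Googlebot" – can be roughly approximated today. A sketch that fetches a page with two different User-Agents and compares the responses; note that real cloaking detection would also need Googlebot's verified IP range:

```python
# Sketch: a poor man's "fetch as Googlebot" – request the page with
# a browser User-Agent and a Googlebot User-Agent and compare.
# Sites that cloak on IP address rather than User-Agent won't be
# caught, and dynamic pages may differ between any two fetches,
# so treat a mismatch as a hint, not proof.
from urllib.request import Request, urlopen

GOOGLEBOT = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
BROWSER = "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"

def fetch(url, user_agent):
    req = Request(url, headers={"User-Agent": user_agent})
    return urlopen(req).read()

def looks_cloaked(url):
    return fetch(url, BROWSER) != fetch(url, GOOGLEBOT)

print(looks_cloaked("http://example.com/"))
```
|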
Sohil | 18 years ago # |
This has been Dugg:
http://digg.com/tech_news/Which_Google_Webmaster_Tools_Do_You_Want |
macewan | 18 years ago # |
Separate real-people hits from search-engine hits. More information on subscribers. |
alek | 18 years ago # |
From Matt Cutts (via Philipp): "ability to "disavow" backlinks to a site"
Did Mr. Google basically just confirm that nefarious folks CAN hurt your Google SERPs via off-page factors such as linking from bad neighborhoods?!? ;-) |
David Hammond | 18 years ago # |
A way to specify which query parameters distinguish a unique page for a given URL. For example, if I have index.php?loc=foo and index.php?loc=bar, I want to be able to let Google know that those are separate and distinct pages, whereas chart.php?cols=name-loc-date and chart.php?cols=name-date are the same page merely with different view settings. Currently, Google may see the two chart URLs as separate pages when they should preferably be listed on Google as a single result and benefit from all incoming links to either URL.
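A rule set like this could even be something the webmaster uploads. A sketch of the normalization David describes, where the table of significant parameters per script is made up purely for illustration:

```python
# Sketch: per script, only some query parameters identify a distinct
# page; the rest are view settings. The SIGNIFICANT table is invented
# for illustration – in David's proposal the webmaster would supply it.
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

SIGNIFICANT = {
    "/index.php": {"loc"},  # loc=foo vs. loc=bar are different pages
    "/chart.php": set(),    # cols=... is only a view setting
}

def canonical(url):
    parts = urlparse(url)
    keep = SIGNIFICANT.get(parts.path)
    if keep is None:
        return url  # no rule for this script: leave the URL alone
    query = [(k, v) for k, v in parse_qsl(parts.query) if k in keep]
    return urlunparse(parts._replace(query=urlencode(sorted(query))))

print(canonical("http://example.com/chart.php?cols=name-loc-date"))
print(canonical("http://example.com/chart.php?cols=name-date"))
# -> both collapse to http://example.com/chart.php
```
|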
Scott Ullrich | 18 years ago # |
It would be nice if a website operator could tell Google "Crawl Now" to update a site's index entries, etc. I came across this need earlier when working with the Google Webmaster Tools. |
I'Been Jokin | 18 years ago # |
Breasts?? What happened to my BREASTS request???
|
Jacques LeBlanc | 18 years ago # |
I would like to know
1) From what country the visitor to my site originated
2) which of my files (mainly PDF) were downloaded and (very important to me) which user downloaded it...or what country downloaded it |
Victor Brodsky | 18 years ago # |
A MeeboMe.com clone, BUT with full statistics and logging of all messages and timestamps, viewable as simple Gmail-style threads. It would be nice if the code was released under the GPL and worked on intranets independently of Google's servers... |
gabo_uy | 18 years ago # |
This feels like writing a letter to Santa... here I go...
Dear Santa G,
First of all, I'd like GA and GWT integrated into a single environment. Then I'd like to be able to query GA as if I were using an OLAP client, and to be able to cross dimensions like: time, hits, sessions, geographic location, referring domain, visit length, main keyword PR.
With this I could create better and more sophisticated queries.
I've been a well-behaved webmaster, I haven't done any blackhat SEO, so I hope you'll visit me this year.
gabo_uy
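A sketch of the kind of cross-dimensional query that letter asks for, run over a hypothetical CSV export (every column name here is invented for illustration):

```python
# Sketch: OLAP-style "cross any dimensions" query over a hypothetical
# analytics export with columns such as country, referring_domain,
# keyword, visit_length_seconds. All names are invented assumptions.
import csv
from collections import defaultdict

def avg_visit_length_by(dimensions, export_file="analytics_export.csv"):
    totals = defaultdict(lambda: [0, 0])  # key -> [sum, count]
    with open(export_file, newline="") as f:
        for row in csv.DictReader(f):
            key = tuple(row[d] for d in dimensions)
            totals[key][0] += int(row["visit_length_seconds"])
            totals[key][1] += 1
    return {k: s / n for k, (s, n) in totals.items()}

# Cross whichever dimensions you like:
for key, avg in avg_visit_length_by(["country", "referring_domain"]).items():
    print(key, round(avg, 1))
```
|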
brettkan | 18 years ago # |
The option to make traffic and keyword data public without giving access to your account ... |
Bryce Roney | 18 years ago # |
I'd make it easier to combat Google Bombing, and remove your site from certain keywords. |
Sean McManus | 18 years ago # |
Blog spam ripping off legitimate content is a real problem. Being able to identify duplicate posts would be helpful, but having a way to report them for review would be even more useful. |
Nendoke | 18 years ago # |
Stream my podcasts (radio stations) via Google. |
John Honeck | 18 years ago # |
An easy option:
My unique pages should be in the:
( ) Supplemental index (X) Normal Index |
Slippy | 18 years ago # |
I would like to see a breakdown of PageRank – perhaps a list of things about your site and why it merits the PageRank that it has. I have a site with a PageRank of 0, even though I have relevant and big sites linking to me and reviewing my site. I have not used any blackhat SEO tactics (and my site appears in the Google listing, so I know I'm not banned or blocked), my site can easily be found in the Google Directory, and I have a very good traffic ranking according to Alexa – and yet I have no idea why I have a PageRank of 0. So a tool that shows a breakdown would be great.
Also, perhaps a tool that would update Google's link database more frequently. While many sites link to my site, they do not show up in Google when using link:www.domain.com – but rather show up when using intext:"www.domain.com". I was told this is caused by Google only updating its link database infrequently; it's been more than 3-4 months and this still hasn't changed (and many of the sites are linking to me directly, without rel="nofollow" tags – they're reviewing and discussing my site, after all).
So it would be nice to have some simple traffic and page ranking tools for the ordinary webmaster. I'm not an SEO guru – far from it – and it would be great to see Google embrace the little guy, with tools simple enough for any webmaster. Thanks. |
Antony Di Scala | 18 years ago # |
I'd like to see an SSH client. |
Dave | 18 years ago # |
I have a whole bunch of error pages like www.site.com/members/name1.htm where the correct link is www.site.com/name1.htm, but I've looked through the site and I can't find where they are. So where an error page is listed, I would like to click on it and see a list of the page(s) where the bad link is found in the first place. |
Michael | 18 years ago # |
A tool to check all pages on a site for broken internal links. |
Kevin Green | 18 years ago # |
Just adding to the fire, clearly: tell us where the 404s come from. Just listing the URL doesn't help us find out how you got there :( |
Andrew Calvo | 18 years ago # |
I'd love to be able to set how often Googlebot will crawl my page – and if it's been a long time since the last crawl, why is that? Did I do something wrong on a page, and Google is "punishing" me? |
Bill DAlessandro | 18 years ago # |
I'd love to see Sitemaps, Adsense, and Analytics all integrated into one interface.
It makes total sense (no pun intended) to be able to manage all these Google tools for Webmasters in the same place.
Please, I beg you! |
Matt Cutts | 18 years ago # |
Hey everybody, I just wanted to say thanks for the feedback. I'm going to send this around Google to make sure that plenty of people get a chance to read it. :) |
Logan Frederick | 18 years ago # |
-More features added to Google Finance. Possibly an API of sorts to include data about stocks on personal sites.
-Support for puts, calls, and short selling in the Portfolio manager |
Niko Kotiniemi | 18 years ago # |
Tool for locating what is assumed to be duplicate content by the spiders (e.g. I have listings of some products with only minor variations – I'm always wondering whether the small differences are noticed, they're important to customers so I'm not going to drop them).
Contextual analysis tools, for which we now have to use 3rd-party tools – such as linguistic analysis (keyword density, distinct words, sentences etc.) compared against the meta information that we enter.
: Niko |
Steve Terjeson | 18 years ago # |
I would like to second Philipp on a couple of things:
- the ability to specify a country where the domain should reside (like the existing domain choice of www.mysite.com vs. mysite.com). I know we have issues with this, as we host sites for international clients in the US and they want to be listed in the Canadian-only searches, for example.
- having a visual link structure to a site would be terrific. A way to show what pages and structure are currently being indexed by Google for a specific site.
I think I would be fifthing (who needs a fifth already this morning?) the idea to remove old pages from the index. |