Does Google index RSS Feeds?
I have a website, TheBossReportCard.com (which I just redesigned based on Fluther suggestions), and would like to get the user comments indexed by the search engines. Since the comments are stored on a database, versus webpages, the search engine robots can’t see them and therefore can’t index them.
I am wondering: if I publish all the comments in RSS feeds, with a link to each, will search engines be able to index them?
5 Answers
When Google (or any search engine I know of) indexes your site, its spider requests each page much like a user’s browser would. The page is generated on the server the same way either way, so unless you’re running something fairly unconventional for your comments, they will be indexed as content.
That’s why you see people commenting with something completely generic or off-topic plus a link back to the site of their choice: they’re hoping for a little link love from Google.
Since an RSS feed is generated to show the same content that would be available anyway, I don’t see why Google would index it separately, but honestly I don’t know whether they do or not.
I’m not sure if they get indexed, but they won’t show up in results.
You say “the comments are stored on a database, versus webpages”… Does that mean you’re not displaying the comments on a page at all?
If you’re pulling the results out of the database and populating a page with them, Google will see the rendered page with the database results, just as any user with a normal browser would.
If the comments are just left in the database and never displayed on the site, then even if you could get them indexed, it would be worthless for search because they’d have no page to link through to.
@funkdaddy and @damien,
The RSS 2.0 feed I generate is in XML. Each item has a boss’s name, position, and company as the title element, a link with a database key, and a description containing the comments left by the user grading the boss. (See the results at http://feeds.feedburner.com/TheBossReportCard) Since I will be updating the feeds every so often, I’m hoping the entries will live for a while in Google.
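For anyone following along, an RSS 2.0 item along the lines described would look roughly like this — the boss’s name, title text, and ID here are made-up placeholders, not actual entries from my feed:

```xml
<item>
  <!-- Title element: boss's name, position, and company -->
  <title>Jane Smith, VP of Sales, Acme Corp</title>
  <!-- Link back to the site using the database key -->
  <link>http://www.thebossreportcard.com/?id=1234546</link>
  <!-- Description: the comments left by the user grading the boss -->
  <description>Comments from the user who graded this boss.</description>
</item>
```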
Maybe somebody can suggest a method for having database entries indexed by Google, such as creating a reference file in my home directory that Google can crawl, but I imagine that could become too large to crawl. I’m thinking of entries in it like:
<a href="http://my.site.com?id=1234546">Text I want crawled</a>
<a href="http://my.site.com?id=1234547">Text I want crawled</a>
<a href="http://my.site.com?id=1234548">Text I want crawled</a>
You can use sitemaps to tell Google where your content is if it’s difficult to reach through normal crawling. The entries in the sitemap should point to pages with content that can be indexed.
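A minimal sitemap along those lines might look like this — the URLs are placeholders modeled on the ID-style links mentioned earlier, not real pages:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per comment page you want crawled -->
  <url>
    <loc>http://my.site.com/?id=1234546</loc>
  </url>
  <url>
    <loc>http://my.site.com/?id=1234547</loc>
  </url>
</urlset>
```

You can then point crawlers at the file with a `Sitemap: http://my.site.com/sitemap.xml` line in robots.txt, or submit it through Google’s webmaster tools.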
Directly indexing content from a database, security issues aside, would degrade the quality of a search engine and ultimately make its results unreliable. If what you’re suggesting worked, you’d be able to create databases full of keywords for Google to index (much like the way meta tags used to work), and then use all sorts of crap to pull people to your site even when the content isn’t related to their search.
If you want content to be indexed, it needs to be visible to the public, not hidden in a database.