Google Caffeine



phanio
12-07-2009, 11:34 AM
Has anyone heard about this or used it during the trial period? Does this mean big changes in Google's rankings, or are they just focusing on the same key aspects they were before?

It seems that they are now going to focus more on things like site speed, site design, and the number of social bookmarks.

Am I missing something here?

vangogh
12-07-2009, 01:11 PM
I checked the beta briefly when Google first announced it, but didn't look at it too closely. From what I understand, Caffeine is a rework of how Google indexes pages. It's a redo of the index itself. Most people who have compared the results they've seen in Caffeine to the regular results haven't reported much difference across most queries.

Google is rolling out the Caffeine update slowly so as not to upset things during the holiday shopping season, but expect that once we hit 2010 it will be rolled out more quickly.

Given that Google is talking about things like site speed, I would think Caffeine is preparing the index for future changes and additions to their algorithms. We may not see a lot of difference with the Caffeine update itself (assuming the reports I've seen are true), but we may start seeing some changes after the update as Google makes more changes.

In general, as long as you're building sites for real people that add real value to the web, you shouldn't have to worry about any change the search engines make. If your search strategy has been to take advantage of holes in the algorithm, or relies heavily on one or two things, then you could be in trouble if an update closes those holes or reduces the weight of those one or two things in the ranking algorithms.

Most of us will probably never notice a difference when Google pushes the Caffeine update through, even though there will be differences in the results we see. Ideally we'll see better results, but odds are we won't notice anything drastic happening.

At the moment I don't know that anyone can tell you what will change, since nothing has really changed yet. Once the changes are out, you can expect to see a lot of people discussing what might have changed, with the usual mix of good and bad information and realistic thoughts mixed with paranoia and complaints. It's what usually happens.

phanio
12-09-2009, 02:45 PM
Thanks - I am just concerned about some things. I am not a web programmer, but I am trying to learn. Most of my learning is trial and error, which may not result in good site design or site speed. I just try things until they work for me, whether or not they are proper and correct.

This is what is worrying me. Will Google look at me negatively if my site is not designed to the best standards? I try to design in ways that are user friendly and get the user to the best information as quickly as possible.

To that note, what would you or anyone recommend as the BEST web design book out there today?

vangogh
12-09-2009, 09:46 PM
The look of the site shouldn't have any effect on ranking. Spiders and robots never see your pages visually.

How those pages are coded does play a part. On the negative side, there are some showstoppers that can really cause problems, for example using Flash for your navigation. Search spiders can't read the Flash and so will get stuck crawling your site. Odds are you're not making these kinds of errors.

Most things, though, will be less a matter of being penalized than a matter of not gaining some benefit. Take site speed. I doubt the end result will be that Google stops showing any page that takes more than x seconds to load. The more likely scenario is something like:

You and I both have sites, and we each have a page that ranks well for a particular query. Say your page ranks #2 and mine ranks #4, and some other pages are #1 and #3. Your page loads very slowly. Mine loads very fast. Maybe you then drop down to #4 and I move up to #2.

Keep in mind there are hundreds of different factors that affect where your page ranks. Site speed would be one of those hundreds of factors. It's something you'd want to do right because it's in there somewhere, but not doing it to perfection probably only has a small effect in and of itself.

Try to learn web standards. They really aren't all that difficult to learn. One great place to learn is w3schools (http://www.w3schools.com/). They have simple tutorials on most everything when it comes to building a website. You won't become an expert reading them, but you will get a great foundation to build on.

It's hard for me to recommend a single book on web design in general, partly because it's been years since I read one and things have changed, and partly because I learned from a lot of more specific books, plus a lot of trial and error.

Is there a specific topic you'd like a recommendation for? I can probably come up with specific books about html or css, etc.

billbenson
12-09-2009, 11:42 PM
You have to start somewhere, so don't get frustrated.

One approach could be to get a copy of Dreamweaver. I believe the current versions integrate CSS pretty well; I'm not sure, as I haven't used it in years. Once you make a page, go into the code, see what it did, and try to figure out why. Then run the code through the W3C validator. It will tell you where you have errors, which will also help you see what the proper code should look like.

If you do this, be aware that Dreamweaver doesn't necessarily produce the best code in the world. But it's a good starting point as long as you also analyze and fix the code.

Either that, or just do tutorial after tutorial after tutorial after... But always validate your CSS and HTML.

vangogh
12-10-2009, 01:03 AM
I wouldn't start with Dreamweaver. Start with something simple like Notepad or the equivalent and type some very simple HTML pages. Then build on those simple pages.
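If it helps, here's a minimal sketch of the kind of simple page you could type into Notepad to start with (the title and text are just placeholders):

```html
<!DOCTYPE html>
<html>
<head>
  <title>My First Page</title>
</head>
<body>
  <h1>Hello</h1>
  <!-- Build on this by adding paragraphs, links, and lists -->
  <p>A very simple page to start from.</p>
</body>
</html>
```

Save it as something like first.html, open it in a browser, then change one thing at a time and reload.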

Joseph I know you're past that point already. Didn't you develop your current site?

One book you might want to read is Designing with Web Standards (http://www.zeldman.com/dwws/). The author, Jeffrey Zeldman, is one of the people who pushed for standards years ago, and the book is considered a classic in web design. I've never actually read it myself, but I did add it to my Amazon wishlist recently. I thought it would be a good refresher, and I'm sure there are a few things I can learn from it.

phanio
12-11-2009, 12:36 PM
Thanks - yes, I have designed and programmed my site myself, mostly trial and error. I'm just not sure if my site is designed to the standards Google and the others are looking for. I don't use Flash and try to make my site very user friendly - I guess I just feel I may be missing something. For example, I try to have every internal page linked from my home page. While it may seem a little messy, I feel this helps users - one click to get the information you need instead of several drill-down clicks. But the other day I was told that I should not have more than 25 internal links on my home page, and that the site should be set up like a tree, with large branches branching off to smaller branches.

If I changed my site to match the 25-internal-link rule, it would change my design (which I struggled to get as it is), but it would also change how I think users interact with my site. Most users want information quickly - I know that if I have to click more than a few links to get where I want to be, I leave the site.

So I am trying to find out what is right and how to design my site better. This also includes some of the programming. I know that I can use CSS to set spaces between images or text. But I also try to keep my CSS file as small as possible, so there are times that I simply use <br /> in my code to set space, or I might even use an empty list item to create space. While it works for me and does not take much to load (regarding load speed), is this right, or will I be penalized for having a few breaks or empty list items?

I will check out the book you reference. Again, thanks!

vangogh
12-11-2009, 01:43 PM
It's all about learning and continuing to learn. It never ends, really. The way I learned to develop sites a few years ago differs from how I develop them now. You are where you are in what you know. You learn more, learn to do what you know a little better, and it continues.

You shouldn't need to use <br /> to create space. It's always going to be better to set that spacing in your CSS. You won't be penalized for it by the search engines, though. It's just going to be easier to maintain if it's in your CSS.
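As a rough sketch of what that looks like (the 2em value is just an example; use whatever spacing your design needs):

```html
<!-- Spacing with extra markup (works, but harder to maintain): -->
<p>First paragraph</p>
<br /><br />
<p>Second paragraph</p>

<!-- The same spacing set once in the CSS instead: -->
<style>
  p { margin-bottom: 2em; }
</style>
<p>First paragraph</p>
<p>Second paragraph</p>
```

If you later want more or less space, you change one rule in the CSS instead of hunting down every <br /> in every page.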

phanio
12-13-2009, 03:52 PM
Thanks - I have a question about internal links. I have seen some sites that use the full URL in the internal link - i.e. http://www.yoursite.com/admin/pages.html - and I have seen some sites use truncated, relative links - i.e. ../admin/pages.html.

Do you know which is better to use? I know that spiders (when indexing) can place the main site URL (http://www.yoursite.com) in front of the relative piece. But when I was looking at my site in one of these "show you what a spider sees" sites, it was reforming my URLs as http://www.yoursite.com/../admin/pages.html - which was, of course, an invalid link.

I would hate to have a spider index an invalid link or information. Do you know what is better to use?

vangogh
12-14-2009, 10:55 AM
Spiders should be fine with relative links. The reason people use relative links is that they are sometimes easier to maintain. Say you move the entire site to a new domain for some reason. If all your links are relative and you keep the same folder structure, your links still work.

I prefer using absolute URLs (with the full http:// on the front) for a few reasons.

1. It leaves no doubt what the URL is. Spiders should be able to figure out the relative URLs, but why make them? Also, if you haven't set up canonical redirection (non-www to www, or vice versa), then someone could enter your site as domain.com instead of www.domain.com, and all the URLs would be missing the www, since the links are relative to the domain.

2. Your content will get copied if you're online long enough and gain any visibility, and absolute URLs still point back to your domain. Assuming you link internally in each of your pages (and you should), then if someone steals your content, it still points back to you. That gives you some cheap, low-quality links and also helps the search engines know which page was the original. It also prevents someone from mirroring your entire site, unless they want to change all the links, which they generally don't want to do.

3. If you publish a feed, or your content might be absorbed in any way other than a direct visit to your site (email, perhaps), then you need your links to be absolute, since a relative link in my feed reader is relative to my local file system, not your site.

Years ago the general advice was to use relative URLs or root-relative URLs (meaning you start from /, which is the same as starting from domain.com), but there are some advantages to using absolute URLs, as mentioned above, and the advantages of relative URLs (moving a site to a new domain) aren't as practical as they first seem.
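To make the three link styles concrete (using the yoursite.com example from above):

```html
<!-- Relative: resolved against the current page's location -->
<a href="../admin/pages.html">Pages</a>

<!-- Root-relative: resolved against the domain root (/) -->
<a href="/admin/pages.html">Pages</a>

<!-- Absolute: always points to exactly one URL, even if the page is copied elsewhere -->
<a href="http://www.yoursite.com/admin/pages.html">Pages</a>
```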

phanio
12-15-2009, 03:06 PM
Thanks, that makes a lot of sense. I did go to your site to see what you use. Now the hard work - making all the URLs absolute.