Getting the Most Out of Google Without Spending a Dime

It's pretty common knowledge nowadays that achieving top SEO rankings can take a long time and become a real hassle. So how can web designers expect to reach those coveted placements when seemingly everyone is fighting for them? There are certainly some 'easy fixes' to be found, but many will get your site banned from Google and promote bad general web practices. And, sure, you could shell out a ton of cash to search engine advertising programs, but they only work for the engine you submit to. In this article, we'll take a look at the best ways to improve your site's organic ranking on Google without breaking the bank... or your reputation.

note: This article will focus on SEO methods that produce the best organic rankings for Google, as that engine is generally regarded to use the most complex methods to rank its findings. If you’re looking to produce quick SEO turnaround for specific search terms and money isn’t a problem, you may want to look into Google’s advertising programs.


All search engines grab a site's readable content using small programs called web crawlers (also known as spiders or robots) that systematically scour the internet for information. To learn how Google's crawler works, check out Google basics. Each engine's crawling methods vary slightly, but careful use of a few key techniques will help them all index a site more accurately. Remember, top engines (and especially Google) look at site structure as much as content when indexing a page.

Your best bet is to write a readable, accessible page that doesn’t obscure content behind too much code and strategically places keywords in appropriate header tags. Obviously, if Google’s crawlers can’t read the content of a page, they won’t properly index it. And most of the time, anything linked from that page will suffer as well. This is the most important reason to make sure each page within a site, and not just a select few, is built for SEO.

Avoid tactics such as keyword stuffing, cloaking, hidden text or using redirects to try and trick search engines. Google has built-in checks that will easily spot these methods and ban your site. Instead, focus on infusing your site with relevant content, keywords and genuine inbound links. Not only do these practices help Google index your site more accurately, they also improve its overall style and content, providing a better browsing experience for your visitors.


While there is no clearly-defined set of steps a designer can take to automatically achieve top Google rankings, there are some tried-and-true methods for promoting your site on search engines. As a general practice, don’t implement a method you are either not familiar with or not sure of. As always, the safest way to build a successful site is to stick to what you know and do it well. That being said, let’s take a look at some easy ways to allow those spiders to better index your page.

Google's spiders are exceptionally optimized for content that adheres to standards set by the World Wide Web Consortium (W3C), especially the implementation of Cascading Style Sheets. CSS both plays extremely well with spiders and promotes excellent general web practices.

Coding with CSS leaves less clutter for spiders to crawl before they find the actual content of a page. By eliminating bulky markup such as nested tables (which often run 3 or 4 levels deep) and replacing it with simple DIV tags, you allow spiders to index pages faster and more accurately than the same page built with table-based HTML. Keeping spiders happy means keeping them coming back. And the more they index your page, the better chance you have of getting found.
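To see the difference, here's a minimal sketch of the same content marked up both ways (the class name and sample text are hypothetical):

```html
<!-- Table-based layout: spiders wade through nested markup before reaching content -->
<table><tr><td>
  <table><tr><td>
      <p>Healthy, lean food for active dogs.</p>
  </td></tr></table>
</td></tr></table>

<!-- CSS-based layout: one DIV, styled in an external stylesheet -->
<div class="content">
  <p>Healthy, lean food for active dogs.</p>
</div>
```

The second version carries the same content in a fraction of the markup, and the presentation lives in a separate .css file that spiders never have to crawl through.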

Within the markup of the page itself, it is imperative to use H1 and H2 tags to your advantage. One of the first things spiders look for is properly structured headings. Naturally, only one H1 tag should appear on each page, containing nothing but a targeted phrase acting as the page headline. I know this seems like a simple step, but you'd be surprised at how many websites just don't take advantage of it. If possible, the page should also incorporate a number of H2 tags, though any more than 4 or 5 may look like keyword spamming. As long as those keywords stay true to the overall content, you'll be fine.
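As a quick illustration (the headline and section topics are made up), a well-structured page might look like:

```html
<h1>Healthy, Lean Food for Active Dogs</h1>

<h2>Why Lean Protein Matters</h2>
<p>...</p>

<h2>Choosing the Right Formula for Your Dog</h2>
<p>...</p>
```

One H1 carrying the targeted phrase, a handful of H2s reinforcing related keywords, and nothing that strays from the page's actual content.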

Of course, some of that good old-fashioned HTML between the &lt;head&gt; tags is still necessary for good indexing. Always make sure your TITLE tag includes some sort of keyword or descriptive phrase summarizing the page content. Ideally, each page should have its own TITLE tag, but I've seen generic ones work just as well. It's really all about what you put up there... a page entitled 'dog food' will never be indexed as well as 'healthy, lean food for active dogs.' Better yet would be something like 'healthy, lean food for active dogs – product information guide.' Try to be as specific as you can while sticking to the main point.
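Putting that advice together, the head of the dog food page might read (a sketch using the example title above):

```html
<head>
  <title>Healthy, Lean Food for Active Dogs - Product Information Guide</title>
</head>
```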

Regarding the content of the page itself, always think 'keyword placement.' Thinking it yet? Great... make it your mantra. Optimized keyword placement is possibly the trickiest part of getting a page indexed well by Google. There is a fine line between choosing the right keywords and using them to your advantage, and choosing the right keywords and using them too much. The first will almost always earn higher page rankings; the latter may get you pinned as a spammer. There are many excellent resources on the web concerning this point, so get your Google on!

Another thing to strongly consider is that spiders cannot read every type of content on a page. And what they can't read, they can't index. JavaScript and Flash in particular tend to give Google the hardest time, though some recent progress has been made. Even so, nothing can be indexed as purely, quickly and intelligently as plain text on a site. I've seen JavaScript behave both well and badly for spiders, and, frankly, I'm still a bit skeptical when including it in my projects. And I (along with the majority of web designers) still don't trust spiders with my all-Flash content.

But sometimes you have to develop that special content for a site, and you're still expected to optimize it for search engines. What then? Well, the best way is to develop a non-Flash version of the site, too. Not only does this allow spiders to index the site, it is becoming increasingly necessary with the exponential rise in mobile web-optimized devices (yes, I'm thinking of the iPhone). Since that device doesn't support Flash anyway, it is always a good idea to develop for the widest audience... including those with limited accessibility.
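One way to serve that fallback without maintaining two entirely separate URLs is to nest plain HTML inside the Flash OBJECT tag: browsers with the plug-in show the movie, while spiders and Flash-less devices read the markup instead (the file name and text here are hypothetical):

```html
<object type="application/x-shockwave-flash" data="intro.swf" width="600" height="400">
  <!-- Fallback content: read by spiders and shown where Flash isn't available -->
  <h2>Healthy, Lean Food for Active Dogs</h2>
  <p>Browse our <a href="products.html">product information guide</a>.</p>
</object>
```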

Even with mostly JavaScript or Flash content, there are still ways of getting your site indexed. Simple as they are, sitemaps have proven over and over again to be one of the most efficient ways to attract spiders.
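A basic sitemap is just an XML file listing the URLs you want crawled, following the sitemaps.org protocol (the URLs and dates below are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2008-06-01</lastmod>
    <changefreq>weekly</changefreq>
  </url>
  <url>
    <loc>http://www.example.com/products.html</loc>
    <lastmod>2008-05-15</lastmod>
  </url>
</urlset>
```

Save it as sitemap.xml at the root of your site and submit it through Google's Webmaster Tools so the spiders know exactly where to look.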

An obvious but often overlooked element of SEO is simply making sure that all links on a site can be read and properly indexed. Remember, Google's spiders can't read content that is embedded with JavaScript or Flash. Always make sure your relevant links can be read, one way or another.

The best way to keep your links highly visible is to code them cleanly and near the top of a page. But even that isn't enough to keep those spiders from getting confused. Be meticulous in making sure the indexable portion of each link includes relevant text describing what it links to. A link that reads 'click to view my portfolio' will certainly get indexed better than one that simply reads 'click here.' And, per our mantra, work in keywords wherever they fit naturally.
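In markup, the difference is small but the indexing payoff isn't (the file name is hypothetical):

```html
<!-- Vague: tells spiders nothing about the target page -->
<a href="portfolio.html">click here</a>

<!-- Descriptive: the link text itself carries indexable keywords -->
<a href="portfolio.html">View my web design portfolio</a>
```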

I've said it time and time again... and any designer worried about SEO will agree: a well-designed, simply-coded page with relevant keyword content is the easiest way to achieve those high organic rankings in Google. It does take some time for spiders to crawl and index a site, but as long as you are patient and do your best to adhere to the latest web standards, you will most definitely improve your site's visibility in the major search engines.

Dave focuses in WordPress UI Design and WordPress Plugin Development (some have labelled him a fanatic) but has hands in many other projects, including photography and music. He's never far from a computer... or an iPad (for testing purposes, of course). Don't even try to bother him during Premier League games, though... he's cheering way too loudly for a team destined to play second fiddle. Dave currently teaches courses in code and philosophy at California State University, Long Beach (well... from home for the foreseeable future) alongside his small business at But, every now and then, he steps outside to protest or give the dog a car ride.


© 2008-2020,