Successful Site in 12 Months with Google Alone (webmasterworld.com)
66 points by buluzhai on May 14, 2009 | 24 comments



Pretty basic stuff. None of it newsworthy. I notice that right now it's #8, but in the classic view it's not even on the home page. This might be one indication of lower quality on the site.


Sorry, but you need to remember that some hackers (eg me) are pretty new to this stuff. I found it helpful to have all this information in one place. For me, the ranking is justified.


38 minutes later it's showing at #3 on /classic and #4 on the home page.


This is precisely the problem with /classic -- even old (classic?) users use the main home page, so they're going to vote up stuff they see there. The front page is so path-dependent that taking out certain votes after-the-fact is never going to make big changes.


Maybe 'classic' users should be shown the classic leaderboard by default?

Alternatively, maybe classic users have the same preferences, but check the site less frequently... so they follow the newer population but at a lag.


I'm not proud of it, but I did some SEO 'stuff' for a local company (gas money), and there really was only so much I could do; they were using Flash navigation that targeted an iframe to generate its content pages, and as if that weren't bad enough, I had no access to any of the *.fla files to do what I needed to. I even tried working around it with JavaScript, but that ended up not being fully compatible with older versions of Firefox... so what did I do?

I said fuck it, rolled their index page back to HTML 4 Transitional and, lo and behold, it actually worked. In fact, it worked so well that I've grown suspicious of XHTML.


The pendulum has finally swung away from XHTML; developers are starting to understand that HTML is not inferior to XHTML. The HTML5 unification process -- under which XHTML and HTML are just two syntaxes for the same language -- should help kill the XHTML-is-better meme for good.


roger wilco


The article has some good points, even if some are rather outdated. Published in 2002!


Exactly. I'm sure this was really useful at the time, but SEO and the web as a whole have changed drastically since 2002. It's gotten a lot more complicated and competitive.


That's like saying the men's 100m has changed. It hasn't; you just have to be better at it than you needed to be 40 years ago.

Nothing has really changed: build something people want, using terminology they use, and get third party citations.

Job done.


"Remember, 80% of your surfers will be at 56k or even less."

Yeah.


What's wrong with that assumption? I assume a lot of people are starting to browse from cellphones, on mobile connections, and from more places around the world. You wouldn't want to bog your site down just because some people have faster connections.


Not 80% of people.


That's it, really. The rest of this is good stuff. Especially considering it's 7 years old.


Well, I'd also note that adding a blog to a site can accomplish some of what he recommended (like adding a page per day).

Also, using mod_rewrite and clever routing can accomplish things like folder names with keywords.
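
Something like this in an .htaccess file would do it (products.php and the cat/item parameters are just placeholder names for whatever the real script expects):

    RewriteEngine On
    # e.g. /laptops/on-sale-laptops/ gets served by products.php?cat=laptops&item=on-sale-laptops
    RewriteRule ^([a-z0-9-]+)/([a-z0-9-]+)/?$ products.php?cat=$1&item=$2 [L,QSA]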


The core methodology is good. But:

- The 5-12k guideline may not be as useful as it once was.

- HTML 4 & JavaScript recommendations may no longer be as valid as they were then.


This post was the SEO bible back in 2003, and a lot of folks used it to go back to basics following the infamous Google "Florida" update. I think it's held up pretty well.

In addition to WMW, Brett runs a pretty good SEO conference called Pubcon.


"If you have the budget, then submit to Looksmart and Yahoo." "Submit the root to: Google, Fast, Altavista, WiseNut, (write Teoma), DirectHit, and Hotbot."

This article is more than a little past its prime.


Anyone have recent experience in whether search keywords in the domain name itself matter?

Separately, what about subdomains?


It's a myth that keywords in the domain name matter. You can use keyword subdomains and they work just fine. My rule is that you should have at least one keyword in the URL, and that the URL should be no longer than 15 words (incl. domain, subdomain, path).


Yes, I am very interested in this exact same question.


I find a keyword in the domain does help, but it really isn't worth chasing to the point of having a stupid name. What you really want is an "exact phrase match".

I.e. "On sale laptops" gets searched for 19.9k times a month https://adwords.google.com/select/KeywordToolExternal

and position 1 is likely to get nearly 40% (if not more) of those clicks http://www.jaygeiger.com/index.php/2008/11/05/percentage-cli...

Thus if you can own OnSaleLaptops.com you stand to get around 10-30k hits/mo; surely you can monetize that :/
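
(Rough math: 19,900 searches x ~40% CTR is about 8,000 clicks a month from the exact phrase alone; plurals, misspellings and related phrases would presumably make up the rest of that range.)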

Hope these answers were OK; they're just my experience.


oldddddddd



