Evidence-Based Search Engine Optimization

Over the last several years, our industry has produced myriad theories of how firms or consultants should go about SEO, each with its own set of shortcomings. Through these flawed systems, a new overarching theory of how our industry should behave has budded. First, let’s identify some of those flawed systems.

Ethical SEO: A “do no harm” type of solution that strictly adheres to the guidelines of major search engines.
- Greatly inhibits the ability of a site to perform in the most competitive industries
- Follows a subjective set of ideals (whose ethics is right? mine is)
- Forces SEOs to react drastically to position changes (such as Google coming out against reciprocal links or paid links)

Performance-Based SEO: An elusive goal which attempts to create a...

Cross Site Request Forgery in Sphinn

I have removed the XSRF exploit, although you can still click on the link below with the text “this story” to cause a vote to happen. Just imagine putting that into an iframe or an <img src>, and it would accomplish the same thing without you knowing… Thanks for the vote! If you are currently logged into Sphinn (or simply forgot to log out), chances are you have just voted for this story. Sphinn’s vulnerability is one of the most common forms of XSRF: the site allows actions to originate offsite without any authentication aside from the original cookie/session. There are multiple ways to prevent XSRF, the easiest of which is to generate a user-specific token for each action origination point on the site (a form, a link that votes, etc.) so...
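To make the token approach concrete, here is a minimal Python sketch, assuming a hypothetical session ID and vote action name; it illustrates the general technique, not Sphinn’s actual fix.

```python
# Minimal per-user, per-action XSRF token sketch (hypothetical names,
# not Sphinn's actual implementation).
import hashlib
import hmac
import secrets

SECRET_KEY = secrets.token_bytes(32)  # server-side secret, never sent to clients

def make_token(session_id: str, action: str) -> str:
    """Derive a token bound to this user's session and this specific action."""
    message = f"{session_id}:{action}".encode()
    return hmac.new(SECRET_KEY, message, hashlib.sha256).hexdigest()

def verify_token(session_id: str, action: str, token: str) -> bool:
    """Constant-time check; reject any vote whose token doesn't match."""
    return hmac.compare_digest(make_token(session_id, action), token)

# The vote link then carries the token, e.g.
#   /vote?story=123&token=<make_token(session_id, "vote:123")>
# An offsite iframe or <img src> can't forge the request, because the
# attacker never learns the victim's token.
```

Because the token never appears on third-party pages, the iframe / img trick described above stops working even when the victim’s session cookie rides along automatically.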

StumbleUpon Categorizes MSN Live Search as Porn

Sorry, this just made me laugh today. After following David Naylor’s post on MSN’s new rockin’ search interface, StumbleUpon pointed out to me that the site I was visiting (a search on airline tickets) was pornography. Since the site had yet to be reviewed, I was a little surprised that this happened. It appears that it was not a hit job by some anti-Microsoft stumblers. So what gives?

Readability of Web 2.0 Content

Readability. Authors, marketers, and webmasters know that it matters, but as communities form around niches of shared interest, the old requirement to “write like your audience is in 3rd grade” has loosened greatly. I recently came upon a piece of software called “Text Master Pro”. The program analyzes the complexity of the vocabulary used in content to determine a UV (readability) score. Despite my usual refusal to use any software with the word “pro” in the title, I fired it up and ran it on the sites listed on 8 different popular Web 2.0 sites. I continued to do this with their individual feeds over the course of the week to get an average readability score for the content posted to their sites. Here are the results. I also include...
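For a sense of what a readability score actually measures, here is a small Python sketch of the classic Flesch Reading Ease formula, which weighs sentence length against syllable density; this is just an illustration, not Text Master Pro’s actual UV calculation.

```python
# Rough readability sketch using the standard Flesch Reading Ease formula;
# the syllable counter is a crude heuristic, and this is not Text Master
# Pro's UV score.
import re

def count_syllables(word: str) -> int:
    """Approximate syllables as runs of vowels, minimum one per word."""
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_reading_ease(text: str) -> float:
    """Standard Flesch Reading Ease: higher scores indicate easier text."""
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    total_words = max(1, len(words))
    total_syllables = sum(count_syllables(w) for w in words)
    return (206.835
            - 1.015 * (total_words / sentences)
            - 84.6 * (total_syllables / total_words))

print(flesch_reading_ease("Write like your audience is in 3rd grade."))
```

Running a formula like this over a site’s feed each day and averaging the scores is one simple way to reproduce the week-long experiment described above.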

SEOMoz Quiz Feeds My Ego

SEO Dark Lord – 100%. Are you an SEO Expert? I must say that every time one of these quizzes comes out, I have to go take it. It is an addiction I am currently unable to kick. The last test I remember had more of a focus on industry news than SEO-specific knowledge, so I was able to perform a little better this time around. Go ahead and take a crack at it!

Exclude-by-Keyword: Thoughts on Spam and Robots.txt

Note: This solution is for spam that cannot be filtered. There are already wonderful tools, such as LinkSleeve and Akismet, to help with comment, forum, and wiki spam. However, the method proposed here would address the more nefarious techniques such as HTML injection, XSS, and parasitic hosting. Truth be told, I rarely use the robots.txt file. Its functionality can be largely replicated on a page-by-page basis via the robots META tag and, frankly, we spend a lot more time getting pages into the SERPs than excluding them. However, after running and creating several large communities with tons of user-generated content, I realized that the robots.txt file could offer much more powerful tools for exclusion: essentially, exclude-by-keyword. The truth is,...
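As a thought experiment, here is a rough Python sketch of how a keyword-aware crawler might behave; the Exclude-keyword directive is entirely hypothetical and not part of any real robots.txt specification.

```python
# Sketch of the proposed exclude-by-keyword idea. The "Exclude-keyword:"
# directive is hypothetical; no search engine honors anything like it today.

def parse_exclude_keywords(robots_txt: str) -> list[str]:
    """Collect values of the hypothetical Exclude-keyword directive."""
    keywords = []
    for line in robots_txt.splitlines():
        line = line.split("#", 1)[0].strip()  # drop comments and whitespace
        if line.lower().startswith("exclude-keyword:"):
            keywords.append(line.split(":", 1)[1].strip().lower())
    return keywords

def should_index(page_text: str, keywords: list[str]) -> bool:
    """A compliant crawler would refuse to index a page containing any keyword."""
    text = page_text.lower()
    return not any(kw in text for kw in keywords)

robots = """\
User-agent: *
Disallow: /admin/
Exclude-keyword: viagra
Exclude-keyword: online casino
"""
keywords = parse_exclude_keywords(robots)
print(should_index("Great discussion about link building.", keywords))  # True
print(should_index("Cheap viagra here!!!", keywords))                    # False
```

The appeal of the idea is that injected spam would lose its search value the moment the keyword filter trips, even on pages the site owner cannot clean up quickly.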