Duplicate Content Prevention: WWW vs. Non-WWW and .Htaccess

Many website owners have a technical SEO issue with their site(s) that they don’t even realize they have, and when left uncorrected, it can really hurt search engine rankings. This situation occurs when one or more pages can be accessed via several different URLs. Read more on Duplicate Content Prevention: WWW vs. Non-WWW and .Htaccess…
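As a hedged sketch of the kind of fix the full article covers, the following .htaccess rules 301-redirect every non-www request to the www version of the site (assuming Apache with mod_rewrite, and example.com standing in for your own domain):

    # Redirect all non-www requests to the www hostname with a permanent (301) redirect
    RewriteEngine On
    RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
    RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]

Picking one canonical hostname this way means each page resolves to a single URL, which is the point of the article.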


Improve Site Security and SEO with One Line of Code

I was recently doing research in Google for a new WordPress plugin we are developing. I was greeted with page after page of results that read like this:

[Image: Google listings that display directory contents.]

The Google results show that many sites have their directory contents listed, and ranked. This tells me that many, many site owners are using default server settings and unwisely revealing the contents of their directories. It is extremely important to hide your directory contents, for two reasons: security and SEO. Read more on Improve Site Security and SEO with One Line of Code…
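The excerpt stops before quoting the article’s one line, but the standard Apache directive for switching off automatic directory listings is this (a sketch, not necessarily the article’s exact code):

    # Disable automatic directory index listings for this directory and its children
    Options -Indexes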


Website Security: Hackers, Botnets, and LIBWWW-PERL: Part 3

Ok, by now, you’ve read Part 1 and Part 2.

Let’s move on.

The Solution: A Few Lines of .Htaccess Code

There is a quick solution that most website owners shouldn’t have any problem implementing.

If the following is not already in your .htaccess file, then insert it near the beginning: Read more on Website Security: Hackers, Botnets, and LIBWWW-PERL: Part 3…
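The snippet itself lives past this excerpt; as a hedged illustration of the general technique this series describes, a rule that refuses requests identifying themselves as libwww-perl might look like this (assuming mod_rewrite is available):

    # Deny any request whose User-Agent string begins with libwww-perl
    RewriteEngine On
    RewriteCond %{HTTP_USER_AGENT} ^libwww-perl [NC]
    RewriteRule .* - [F,L]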


Website Security: Hackers, Botnets, and LIBWWW-PERL: Part 2

Ok, by now you’ve read Part 1, so let’s continue.

NOTES ON SECURITY:
Security is about reducing risk, and lowering the statistical probability of a successful attack. You can never eliminate risk fully, and there is no such thing as 100% impenetrable security, even with the best measures in place. By increasing the the level of security for your site or application, you are shrinking the pool of hackers that have the [skill|experience|time|resources|desire] to hack your site. In most criminal acts, it’s about following the path of least resistance — if you increase the difficulty of success (sometimes by even a small margin) then often the hacker will go somewhere else.

Think of other crimes, like car theft or breaking into a house. In most cases, if a thief is checking out your car but discovers that it has all the top security measures, he’ll move on to an easier one. That is, unless he has a specific reason to target your car. There are very purposeful and targeted crimes, but these are much less common than the crimes of least resistance. When hackers break into banking or large corporate websites, they have a specific target and incredible amounts of skill and resources. Compared to typical website hacks, the overall percentage of attacks like this is extremely low, because there aren’t many out there who could carry one out. Read more on Website Security: Hackers, Botnets, and LIBWWW-PERL: Part 2…


Website Security: Hackers, Botnets, and LIBWWW-PERL: Part 1

[Image: Take proper security measures to protect your website.]

Recently, there has been a rash of automated hacker attacks defacing websites across the globe that don’t employ adequate security measures. Earlier this week, several friends of mine had their sites hacked and defaced. Most of these attacks don’t come from experienced hackers; they come from script kiddies employing automated scripts and networks of compromised computers (botnets). Even though these junior hackers may be inexperienced, they know enough to take down your site, and I don’t need to explain how much that can cost your business in lost revenue.

Don’t worry, though: there is a simple solution that will reduce your site’s susceptibility to these attacks and buy you some time to plug security holes. It’s relatively easy to implement, even if you’re not a security expert. Read more on Website Security: Hackers, Botnets, and LIBWWW-PERL: Part 1…


Detect User-Agents: Cloak and Dagger for Web Sites – Part 2

“I’ve heard of User-Agents…”

In a previous post, I introduced you to User-Agents. Now let’s find out why you need to detect them, and how. Read more on Detect User-Agents: Cloak and Dagger for Web Sites – Part 2…
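The why and how are in the full post; as a rough sketch of the detection side, .htaccess can match against the User-Agent header and act on the result, for example (BadBot is a placeholder agent name, not one from the article):

    # Flag requests by User-Agent, then deny the flagged ones
    SetEnvIfNoCase User-Agent "BadBot" bad_bot
    Order Allow,Deny
    Allow from all
    Deny from env=bad_bot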


.htaccess Reference: Part 2

In case you missed it, you can read the first part of the .htaccess Reference here.

Password unprotection

Unprotect a directory inside an otherwise protected structure:

    Satisfy any
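In Apache 2.2-style access control, the directive typically sits in the unprotected subdirectory’s own .htaccess, alongside open host-based access, so either condition satisfies the server (a sketch, assuming the parent directory holds the usual Auth/Require directives):

    # .htaccess in the subdirectory that should NOT prompt for a password
    Order Allow,Deny
    Allow from all
    Satisfy Any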

Extra secure method to force a domain to only use SSL and fix double login problem

If you really want to be sure that your server only serves documents over an encrypted SSL channel (you wouldn’t want visitors typing a password into an .htaccess login prompt over an unencrypted connection), then you need to use the SSLRequireSSL directive with the +StrictRequire option turned on. Read more on .htaccess Reference: Part 2…
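A sketch of how those pieces fit together, with yourdomain.com as a placeholder (the exact form in the full reference may differ):

    # Refuse any request that did not arrive over SSL, and bounce errors to the https site
    SSLOptions +StrictRequire
    SSLRequireSSL
    SSLRequire %{HTTP_HOST} eq "yourdomain.com"
    ErrorDocument 403 https://yourdomain.com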


.Htaccess Reference

.htaccess is a configuration file used on Apache and other *nix-based web servers. It is one of the most configurable and powerful tools for website functionality, security, and search engine optimization. Here is a comprehensive reference.

.Htaccess (Hypertext Access) is the default name of Apache’s directory-level configuration file. It provides the ability to customize configuration directives defined in the main configuration file. The directives must be allowed in the .htaccess context, and the user needs appropriate permissions. Read more on .Htaccess Reference…
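As a minimal illustration of that relationship (the path and directives here are assumptions, not taken from the reference itself):

    # httpd.conf: allow .htaccess files under this directory to override configuration
    <Directory "/var/www/example.com">
        AllowOverride All
    </Directory>

    # .htaccess inside that directory: a per-directory directive that now takes effect
    ErrorDocument 404 /notfound.html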


.Htaccess IP Banning – Block Bad Visitors

Increase your web site’s security by blocking bad visitors with .htaccess. If you have nuisance visitors, site scrapers, or spammers, you may want to add some lines of code to your .htaccess file that will block bad visitors by IP address or by blocks of IP addresses. Read more on .Htaccess IP Banning – Block Bad Visitors…
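A hedged sketch of the sort of rules the article walks through (the addresses are placeholders):

    # Deny one specific IP address, plus an entire 123.45.68.* block
    Order Allow,Deny
    Allow from all
    Deny from 123.45.67.89
    Deny from 123.45.68.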


URL Rewriting – Search Engine Friendly URL’s – Part 2

You probably know by now that dynamic websites face a challenge in search engine optimization, because they often use dynamic URLs that carry information from page to page in query strings (http://www.yourdomain.com/index.php?id1=value1&id2=value2&id3=value3).

Through proper use of .htaccess and mod_rewrite, you can turn your ugly dynamic URLs into search engine friendly (and user friendly) URLs. Read more on URL Rewriting – Search Engine Friendly URL’s – Part 2…
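As a sketch of the idea (the URL pattern and parameter name are illustrative assumptions, not the article’s own example):

    # Map a clean URL like /products/42/ onto the underlying dynamic script
    RewriteEngine On
    RewriteRule ^products/([0-9]+)/?$ index.php?id1=$1 [L,QSA]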


Page 1 of 2