Slashdot Effect Prevention: Still Necessary?
Author: Wayne Eggert
Chances are, if you're a web host or have ever had an interest in hosting a website, you've heard of the Slashdot Effect. It describes a condition in which a web server is overloaded because an article or review was published at another web site. The term obviously comes from Slashdot.org, the #1 site for geek news on technology subjects. Many people also use the term to mean simply that a website has been posted on Slashdot.org (e.g., "Mydomain.com was Slashdotted on 3/5/05"). For the purpose of this article, we will consider the Slashdot Effect to mean a Denial of Service (DoS) caused by the extreme amount of traffic that follows an article being posted on Slashdot.org.
The Slashdot Effect has generally affected smaller websites that are ill-prepared for a large amount of traffic. Sites hosting video or large images are likely to be hit harder than sites with text content & sparse images. During the Slashdot Effect, a site experiences an unprecedented number of page requests, and depending upon the internet connection & server hardware, it may not be able to process all of them in a timely fashion.
One possible bottleneck behind the Slashdot Effect is inadequate server hardware. If the computer hosting the website does not have a fast processor, lacks physical RAM, has a slow hard drive, or any combination thereof, it may simply not be able to keep up.
A second potential bottleneck is the server's internet connection. Hosting a site on a cable modem and expecting to serve 100 or 500 users per minute is unrealistic, whereas serving that number of users from a server connected to multiple T1 lines is much more feasible.
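As a rough sanity check on those numbers, here's a back-of-envelope calculation. The 200 KB average page weight is purely an assumption (adjust it for your own site); a T1 carries 1.544 Mbit/s:

```python
# Back-of-envelope bandwidth estimate. PAGE_SIZE_KB is an assumed
# average page weight, including images -- adjust for your site.
PAGE_SIZE_KB = 200
users_per_minute = 500
T1_KBITS = 1544  # one T1 line = 1.544 Mbit/s

# KB/min -> kbit/s: multiply by 8 bits per byte, divide by 60 seconds.
kbits_per_sec = users_per_minute * PAGE_SIZE_KB * 8 / 60.0

print("%.0f kbit/s sustained (~%.1f T1 lines)" %
      (kbits_per_sec, kbits_per_sec / T1_KBITS))
```

Under these assumptions the sustained load is around 13 Mbit/s, which is well beyond a 2005-era cable modem upload but within reach of a bundle of T1s.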
Web server client limits (Apache's MaxClients directive, for example) might also cause problems for sites receiving tons of traffic from Slashdot. Web server software usually allows the server administrator to set a limit on the maximum number of clients that can connect at one time. This allows for control over the number of processes being spawned and the amount of memory/resources that are allocated to the web server. Without this control, during times of very high traffic, a web server could potentially wreak havoc on other services being offered on the same machine, such as mail, FTP, or DNS.
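As a sketch of what such a limit looks like, here is an illustrative Apache prefork configuration. The numbers are placeholders, not recommendations -- tune them so that MaxClients times the average size of one Apache process stays below your physical RAM:

```apache
# httpd.conf -- illustrative values only
<IfModule prefork.c>
    StartServers            5
    MinSpareServers         5
    MaxSpareServers        10
    MaxClients            150   # hard cap on simultaneous connections
    MaxRequestsPerChild  1000   # recycle children to contain memory leaks
</IfModule>
```

If MaxClients is set too high for your RAM, a traffic spike pushes the box into swap and everything on it slows to a crawl; set too low, visitors queue or get turned away, but the rest of the server stays healthy.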
Poorly written code, unoptimized databases, or frequent hard drive access might also influence how well a server can perform. With database-driven sites, programmers often fail to consider how much page load time is spent in calculations within the code. If a script takes 3-5 seconds to churn out its web page because it's calculating or organizing results from a database, that's going to have a major impact when there are 100 users per minute accessing it. A Perl or PHP script that writes data to the hard drive, reads from the hard drive, or spawns a process on every request is going to be bad news when an unexpected number of users hit it.
How can the Slashdot Effect be prevented? Well, that's a tough question -- the answer depends on how much money you have. If you're a large company with an unlimited amount of funds, investing in really nice hardware (e.g., load-balanced servers with fast processors & gigs of memory) & fat data pipes would likely do the trick. For those of us who don't have that kind of cash and have to make do with shared webhosting or limited hardware, the answer is quite different.
First and foremost, not every site is going to be a victim of the Slashdot Effect, Anandtech Effect, or any other "Effect". Even if your site is small and has the potential to be crippled if a large number of users were to visit, it doesn't mean that it's going to happen. There's no need to run out and buy $30,000 worth of equipment to protect against something that might never occur, right? Baby steps, baby steps.
- Optimize your code or have your programmers optimize it (that's what they exist for, right?). You might not even have to spend any money here if it's a DIY (Do It Yourself) operation.
- If there are pages that take 5-10 seconds to load, see if there's a way to re-organize the data or to perform the expensive operations less often (caching results is one approach).
- Optimize your database tables, and in particular add indexes on the columns used to join large tables with other large tables.
- Set realistic limits on the maximum number of clients in your web server software, based on the processor speed and amount of memory in your server.
- Design your site with a conservative number of images -- your visitors will thank you.
- Identify parts of your website that have large amounts of high-resolution images (e.g., photo galleries), videos, or large downloads. Determine whether these sections should be monitored and/or use scripts to limit concurrent downloads.
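To make the database-index step above concrete, here is a small sketch using Python's built-in sqlite3 module. The schema and table names are made up for illustration; the point is that the join column gets an index so each probe is a lookup rather than a full table scan:

```python
import sqlite3

# Hypothetical schema: articles joined against their comments.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE articles (id INTEGER PRIMARY KEY, title TEXT)")
cur.execute("CREATE TABLE comments (id INTEGER PRIMARY KEY,"
            " article_id INTEGER, body TEXT)")
cur.executemany("INSERT INTO articles (title) VALUES (?)",
                [("article %d" % i,) for i in range(100)])
cur.executemany("INSERT INTO comments (article_id, body) VALUES (?, ?)",
                [(i % 100 + 1, "nice post") for i in range(1000)])

# Without this index, every join probe scans the whole comments table.
cur.execute("CREATE INDEX idx_comments_article ON comments (article_id)")

rows = cur.execute("SELECT articles.title, comments.body FROM articles"
                   " JOIN comments ON comments.article_id = articles.id"
                   ).fetchall()
# Running EXPLAIN QUERY PLAN on the same SELECT shows whether the
# database is using the index or falling back to a table scan.
```

The same idea applies to MySQL or PostgreSQL: index the columns that appear in your JOIN and WHERE clauses on large tables, and check the query plan to confirm the index is actually being used.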
All of the steps listed above are simple ways to ensure you're optimizing your website to the best of your ability before adding new hardware or software solutions into the mix. Even if you do determine that you'll need to add more hardware, working through some of these steps will only prove beneficial in the long run.
Slashdot.org has grown its user-base drastically in the last 5 years, though this doesn't mean the Slashdot Effect has grown that much more powerful. In fact, with more technology-related sites and blogs being created each day, the effect of a Slashdotting is actually smaller today than in previous years. Most webmasters would likely welcome a mention on a large technology portal, as long as it didn't cause downtime for their users or the other services on their server. Small sites, however, may well still fall victim to the Slashdot Effect and any resulting bandwidth charges or outages the "Effect" causes. For these sites, the only prevention is playing it smart: optimize and analyze ahead of time, or, as a last resort, block referrals from Slashdot.org and similar high-traffic sites using .htaccess.
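For completeness, the .htaccess approach mentioned above might look something like this. It's a blunt, last-resort sketch -- it requires mod_rewrite to be enabled and AllowOverride to permit it, and it turns away every visitor who clicks through from the matching site:

```apache
# .htaccess -- refuse requests referred from slashdot.org (sketch only)
RewriteEngine On
RewriteCond %{HTTP_REFERER} slashdot\.org [NC]
RewriteRule .* - [F]
```

The [F] flag returns a 403 Forbidden instead of serving the page, so the flood costs you almost no bandwidth -- at the price of rejecting exactly the audience most sites would love to have.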