Welcome to the Site Bulletin!
Dear eBay Community:
In our last technical update in September, we discussed our continued commitment to site stability. We announced that a warm backup was in place and that our next major step was implementing a high-availability backup system, sometimes referred to as a "hot backup."
Hot Backup Versus Warm Backup
Before updating you on the hot backup status, we'd like to define the terms "warm backup" and "hot backup," so that we're all on the same page.
A "warm backup" allows eBay to recover from most major hardware or software failures in its database within a timeframe ranging from 20 minutes to two hours. Examples of "major hardware failures" are CPU or system board failures that cause the operating system to reboot itself. An example of a "major software failure" is the operating system crashing and rebooting itself.
In many cases, it is faster to let the system reboot itself than to move to the warm backup. A reboot takes approximately 45 minutes, whereas going to a warm backup could take longer. This is because the warm backup system is a complete duplicate of the eBay system (CPUs and all), and moving to the warm backup requires that we move a small number of database updates to the backup before bringing the system back up. Depending on how many updates we need to move and apply, this process can take anywhere from 20 minutes to two hours.
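For readers curious about the mechanics, here is a rough sketch in Python of why warm-backup recovery time varies. It is purely illustrative and not eBay's actual recovery code: `pending_updates` stands in for the database updates not yet shipped to the backup, and `apply_update` stands in for the database's replay mechanism.

```python
def bring_up_warm_backup(pending_updates, apply_update):
    """Replay queued database updates on the standby before opening it.

    Illustrative only. Recovery time grows with the size of the
    backlog, which is why the window described above ranges from
    20 minutes to two hours.
    """
    applied = 0
    for update in pending_updates:
        apply_update(update)   # bring the standby's copy up to date
        applied += 1
    # Only after every pending update is applied is the standby
    # consistent and ready to serve traffic.
    return applied
```

The key point the sketch captures: the warm backup cannot simply be switched on; it must first catch up on whatever changes it has not yet received.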
A "hot backup" means that eBay can recover from most major hardware and software failures in a timeframe ranging from 5 minutes to 1 hour. A "hot backup" can recover faster because it is TWO systems SHARING all the disks and the data that support eBay. When recovery is necessary, no data has to move: the backup system simply comes up and picks up where the failing system left off.
We say that this can take "up to an hour" because we need to allow time for determining why the system failed and to make sure that using the hot backup is the right thing to do.
There are certain obscure failures in which recovery using the hot backup is NOT the right course of action, and initially, we want to be very conservative in this new environment. As we gain experience with the hot backup environment, recovery time will shorten.
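To make the takeover decision concrete, here is a toy sketch in Python. It is not the Sun/Veritas clustering software we actually use (which is far more involved); the callback names `is_primary_alive` and `safe_to_fail_over` are invented for illustration. It shows the two ideas above: the standby needs no data copied before taking over, and in certain failures the conservative answer is to hold off.

```python
MISSED_BEATS_LIMIT = 3  # consecutive missed health checks before declaring failure

class FailoverMonitor:
    """Toy sketch of a hot-standby takeover decision (illustrative only)."""

    def __init__(self, is_primary_alive, safe_to_fail_over):
        self.is_primary_alive = is_primary_alive      # health-probe callback
        self.safe_to_fail_over = safe_to_fail_over    # diagnostic/operator check
        self.missed = 0

    def tick(self):
        """Run one health check; return 'ok', 'promote', or 'hold'."""
        if self.is_primary_alive():
            self.missed = 0
            return "ok"
        self.missed += 1
        if self.missed < MISSED_BEATS_LIMIT:
            return "ok"  # transient blip; keep waiting
        # Primary looks dead. Be conservative: confirm that taking
        # over is the right action for this class of failure before
        # promoting the standby.
        return "promote" if self.safe_to_fail_over() else "hold"
```

The "hold" branch is why we quote "up to an hour": the time goes into diagnosing the failure, not into moving data.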
To support a hot backup, we needed to install new software from Sun and Veritas. It's safer to build a new system than to layer new software on top of the existing one, so we completely rebuilt eBay's database server environment on two new Starfire machines, including a new set of disks and software, in a new location.
Over the past few months, we have been building this new system in close cooperation with Sun, Oracle, and Veritas. In fact, we have already completed a dress rehearsal of the switchover to the new system, and at least two more are planned before the final move to the new system.
Many of you may wonder why this has taken so long. When you consider that it involved the complete installation of TWO new Sun Starfires, with all of the disks and software involved, I'm sure you'll see that it's not a simple task. In addition, we are developing new operations and administration procedures to run the new system and make it as reliable as possible for our customers.
Although this is a big change for our system, it won't be a big change for our members. We're changing the database back end, but NOT the applications that run eBay. From a member's perspective, the eBay experience should be identical to what you see today. In order to ensure this, we will institute a "feature freeze" for the week before the switchover to ensure that we don't confuse feature changes with hot backup configuration changes.
The switchover is currently planned for early November. We'll keep you updated if this changes.
Many members have asked us about eBay's performance, and we wanted to address some of those questions.
Like many other companies, eBay uses Keynote's services to measure and analyze our web site performance. Below, you'll see a graph of a typical day's response times from Keynote for eBay.
As you can see, the "View Item" function's response time (denoted by the purple line) is typically 2 seconds or less, averaged over the continental United States. The other lines represent the functions of Viewing Bids (the green line), Viewing Feedback (the dark blue line), and Search (the light aqua-blue line).
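The measurement idea behind those lines is simple: time how long a page fetch takes, repeat, and average. Here is a minimal Python sketch of that idea. It is an assumption-laden illustration, not Keynote's actual methodology (Keynote measures from many geographic locations); `fetch` is any zero-argument callable you supply that retrieves a page.

```python
import time

def timed_request(fetch):
    """Return (result, elapsed_seconds) for a single page fetch."""
    start = time.monotonic()
    result = fetch()
    elapsed = time.monotonic() - start
    return result, elapsed

def average_response_time(fetch, samples=5):
    """Average the elapsed time over several fetches of the same page."""
    times = [timed_request(fetch)[1] for _ in range(samples)]
    return sum(times) / len(times)
```

In practice `fetch` might wrap an HTTP request to a "View Item" page (for example, via `urllib.request.urlopen` on a hypothetical item URL), and the averages across locations and times of day are what a graph like the one above plots.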
Our biggest challenge is Search. As quickly as we add capacity for Search, it gets used up. Currently, eBay members conduct over 10 MILLION searches a day. We're adding capacity as quickly as we can, but demand sometimes gets ahead of us, and response time exceeds what we'd all like to see, especially during prime time.
In the next few weeks, we're introducing major internal changes to Search that should improve its efficiency. This means we'll be able to do more with less hardware, which should help us keep search response times up to par.
Many members have asked if new features like Great Collections affect site performance. Great Collections, AOL, and International all run on separate sets of servers to prevent them from having an impact on site performance. (They all share a database server with the core eBay site, but they currently generate a fraction of the load that the core eBay site does.) This also keeps problems in these sites from affecting the core eBay site.
As you know, we are working to keep our members better informed about new features and functionality that are coming to eBay. Our first Upcoming Events Update was last week, and we hope it was helpful.
It's important to realize that the introduction of new features (we call them "trains") can sometimes come in ahead of or behind schedule depending on Quality Assurance (otherwise known as Testing) and Development cycles within eBay. We cannot quote exact implementation dates, and new priorities or unexpected problems may delay changes you're looking forward to. In addition, there are some changes that we cannot announce in advance due to legal, competitive or policy reasons.
It's also important to realize that not ALL changes are visible to you. In October, for example, we're introducing many new internal "headroom" features that will improve the performance of eBay or make certain functions more efficient. You won't be able to tell that there was a change by looking at the site, but we can see changes in the internal performance of the functions, which gives us better performance and more room to grow.
I won't cover the upcoming features here, since they will be covered in other regular updates.
Who works on what in eBay Technologies?
eBay Technologies is made up of several teams, each of which does something different to support eBay. These are two teams you'll hear about a lot.
Site Operations runs the site and maintains the environment that supports the eBay system. This includes installing and maintaining the various machines, operating systems, routers, security software, etc. that are required to run eBay. Site Operations is the team working on the Hot Backup described above.
Product Development codes and tests the features on the eBay site. Features such as International, My eBay, Great Collections, and Autos all come out of the Product Development group. Product Development also works with the Architecture team on performance enhancements which work "behind the scenes" to help eBay keep up with its growth.
Fortunately, the people who work on new features (the Product Development team) are NOT the same people who work on things like Hot Backup or future architecture. In fact, the people in these groups have completely different skill sets. So, if you see new features like Great Collections rolling out, this does NOT mean they're done at the expense of infrastructure improvements. They are truly parallel activities. Site stability remains our top priority, and we will not compromise site stability for new features.
We hope you find this technical update useful. We will continue to provide monthly updates on the technical status of the site, to help you understand and feel comfortable with what eBay is doing to keep up with you and to serve you better. We value and appreciate your business, and we look forward to a great future of working together.