Demolition & Data Centers
May. 4th, 2018 01:02 pm

I started working for Deem, Inc. (formerly Rearden Commerce) in December 2008. At that time, I had passed all the interviews, left my job at Fannie Mae, and was taking a break of a few weeks before my official start date of January 5, 2009. Meanwhile, four guys were moving equipment from a Savvis data center near Boston to one near Sterling, VA. My job would be to look after this hardware once they finished and returned to the San Francisco Bay area, where they normally lived & worked. It would be advantageous to work with the new systems as they were being installed, and to meet the guys I'd be working with for the next while.
It is now nine years later, and I'm still at Deem, even though all of those guys have left for other jobs. And on Tuesday, April 10, 2018, I completed the decommissioning & demolition of the cage of equipment which we had collectively installed back in late 2008.
The new cage, which we assembled in May 2017 as our new production system, occupies nine racks; the old systems occupied 21. The new cage uses about 1/5 the electricity the old one did. I'm certain our air conditioning requirements are also substantially smaller.
Decommissioning is both harder and easier than one might suspect. On the good side, one no longer needs to be gentle with equipment, especially hard drives. I'm so accustomed to holding equipment gingerly and ensuring it receives no shocks, whether physical, static or otherwise. When decommissioning, though, it's rather nice to toss a hard drive or drop an entire stack into a bin instead of placing them carefully into padded trays. Taking the old equipment to the electronics recycling dumpster was a lot of fun, kinda like throwing a discus but with servers.
We brought in an outside vendor to shred the hard drives so we would have a certificate of destruction for our auditors, and of course to protect the customer data which was on them (encrypted, naturally). In all, we shredded 608 drives.
Trashing an entire cage also meant erasing a number of embarrassments. Yes, I did my best to keep cabling tidy & colour-coded, but sometimes we needed to cut corners because of urgency or a lack of parts. And over nine years, some systems get decommissioned and new ones added: even when starting clean, it's hard to keep things neat as systems change organically. Trashing the cage removed all of the eyesores and little compromises.
Getting rid of old equipment also means fewer future trips to the data center. Fewer drives, and newer ones, means the overall failure rate has plummeted. Towards the end, I was making trips to the cage every 2-3 days; now it's once every 2-3 weeks. I have to admit, though, that some of the drives had been working 24/7 for at least nine years: some were still the originals from the Boston data center.
There was an obvious evolution of racking kits over the past nine years. The oldest equipment used rack nuts & bolts to hold trays in place, on which the equipment would sit. Then came rails held in place with rack nuts & bolts. Then we got simple rails which locked themselves in place without requiring pre-installed nuts & bolts. Most of the equipment required the nuts & bolts, so my electric screwdriver became my best friend ever: without it, I'd still be removing equipment today.
The hardest part, besides lifting so much equipment, was disconnecting all of the ethernet and fibre cabling. It's a simple pinch to unlock the cable from the network port, then a slight pull to remove it. But do this several thousand times and your fingers get very worn and bruised. Booted/snagless cables are the worst: I have learned a new hatred for them. Add to this the occasional scrape, scratch or cut. Merely washing my hands was agony each time. It took a week before I could hold a pen comfortably, and nearly a month for my fingers to return to normal.
It was a long & glorious ride. The original equipment worked longer & harder than we had any right to ask of it. We had some scares & nightmares, but on the whole, it all worked well. Rest in peace.