You can find part 1 of this saga, including the backstory, here.
Strip malls and low-rent office/retail buildings present a number of challenges to the fledgling IT-focused company. From poor electrical infrastructure to lack of security to lack of options for broadband, the small business IT guy has his work cut out for him. Add to that the fact that management typically chooses space based on price, and you’re lucky if you even get the address before moving day, much less a voice in the selection process.
And so I found myself cramming 100TB of disk into a closet large enough for a 42U cabinet and a 2-post. And when I say large enough, I mean JUST large enough. I was able to convince them to install a door wide enough that I could maneuver the cabinet out on its wheels to get behind it. To add to the fun, this closet was in the office space, so noise was a factor.
Fortunately, I had a say in how the room was built. My goal in building the room was to maximize the use of space, keep the noise and cold air in, and provide a modicum of security; after all, sensitive equipment would be in this room. To that end, the builder put fiberglass insulation into the walls, and then doubled the drywall on the outside facing the office. The inside walls were drywalled and then covered in plywood lagged to the studs, providing a strong base for mounting wall-based equipment and further reducing sound. The roof was a lid consisting of solid steel sandwiched between plywood above and drywall below. A steel door with a push button combination lock and steel frame completed the room.
Since the business was a warehousing operation, shipping was its most important function, and delayed shipments could result in financial penalties. Since shipping was a SaaS function, my goal was to provide enough power and Internet connectivity for the business to complete the bulk of the day's shipping even during a power outage. Installing a generator was impossible due to the location, so I had to settle for batteries. I ended up with five UPSes in total. One Tripp Lite 3kVA and one APC 3kVA split the duties in the server cabinet, and another Tripp Lite 3kVA kept the 2-post (with the PoE switch for the cameras) and wall-mounted equipment alive. I also had a 1,500VA unit at each pair of shipping tables to power the shipping stations (Dell all-in-ones) and label printers. I added external battery packs to each unit, bringing total runtime to about five hours. That gave plenty of time to either finish the shipping day or decide whether to rent a portable generator for a longer outage. So far, there has been only one significant outage during a shipping day, and the production line worked through it without a hitch.
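For rough sizing, the runtime math is simple: usable battery watt-hours divided by the load in watts, derated for inverter losses. Here's a minimal sketch of that estimate; the capacity, load, and efficiency figures below are illustrative assumptions, not measurements from this build.

```python
# Rough UPS runtime estimate. The battery capacity, load, and inverter
# efficiency are hypothetical placeholder values for illustration.

def runtime_hours(battery_wh: float, load_w: float, inverter_eff: float = 0.85) -> float:
    """Estimate runtime in hours for a given usable battery capacity and load."""
    return (battery_wh * inverter_eff) / load_w

# Hypothetical example: a 3kVA unit with external battery packs
# (~2,000 Wh usable total) carrying a ~300 W steady load.
print(f"{runtime_hours(2000, 300):.1f} hours")  # ~5.7 hours
```

Real-world runtime falls short of nameplate math as batteries age, so padding the estimate (and testing under actual load) is the safer bet.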
Cooling for the server room was provided by a Tripp Lite SRCOOL12K portable air conditioner, with the exhaust piped into the plenum above the drop ceiling. While this did the job, I would have preferred a dual-hose unit with a variable-speed inverter drive for better efficiency. We investigated a mini-split, but due to property management requirements, it would have taken months and cost many thousands of dollars. If cooling failed, the server equipment could run for well over an hour before heat buildup became an issue, which was enough time to open the door and set up a fan. Equipment could also be shut down remotely, further reducing heat production.
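How long a sealed room can coast without cooling comes down to heat load versus thermal mass. The sketch below makes that estimate under stated assumptions: the room volume, equipment steel mass, and heat load are hypothetical, and it conservatively ignores heat soaking into the walls.

```python
# Back-of-the-envelope estimate of how fast a sealed closet heats up when
# the AC stops. All input figures are assumptions for illustration.

AIR_DENSITY = 1.2   # kg/m^3
AIR_CP = 1005.0     # J/(kg*K), specific heat of air
STEEL_CP = 460.0    # J/(kg*K), specific heat of steel

def rise_per_hour(heat_w: float, room_m3: float, equip_steel_kg: float) -> float:
    """Degrees C of temperature rise per hour, ignoring losses through walls."""
    c_air = room_m3 * AIR_DENSITY * AIR_CP   # J/K stored in the room air
    c_equip = equip_steel_kg * STEEL_CP      # J/K stored in racks and chassis
    return heat_w * 3600.0 / (c_air + c_equip)

# Hypothetical closet: ~10 m^3 of air, ~500 kg of steel, 1 kW of IT load.
print(f"{rise_per_hour(1000, 10.0, 500):.1f} C/hour")  # ~14.9 C/hour
```

Under those assumptions, a room starting at 20C has roughly an hour before temperatures get uncomfortable for the hardware, which lines up with the grace period described above.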
In addition to physical security, infrastructure security had to be considered. To that end, I deployed a physically separate network for the surveillance, access control, and physical security systems. Endpoints ran antivirus with GPO-enforced firewalls and auto-patching, and Ninite Pro took care of keeping ancillary software up to date. As all of the company equipment was wired, the wireless network was physically segmented from the rest of the network for BYOD and customer use. pfBlocker was deployed on the pfSense firewalls to block incoming and outgoing traffic to and from countries where we did not do business, and outbound traffic was initially limited to ports 80 and 443, with additional ports added on an as-needed basis. Finally, I deployed Snort both on the firewalls themselves and in various VMs to catch any intrusions that did happen.
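To make the layering concrete, here is a toy model of the egress policy in Python. This is purely illustrative logic, not pfSense or pfBlocker configuration; the country codes and port list are placeholders.

```python
# Toy model of the layered egress policy: geo-based blocking first
# (the role pfBlocker played), then a port allowlist. Illustrative only.

BLOCKED_COUNTRIES = {"XX", "YY"}      # placeholder ISO country codes
ALLOWED_OUTBOUND_PORTS = {80, 443}    # expanded on an as-needed basis

def allow_outbound(dest_country: str, dest_port: int) -> bool:
    """Return True if an outbound connection passes both policy layers."""
    if dest_country in BLOCKED_COUNTRIES:
        return False                  # geo block takes precedence
    return dest_port in ALLOWED_OUTBOUND_PORTS

print(allow_outbound("US", 443))  # True: allowed port, unblocked country
print(allow_outbound("US", 25))   # False: SMTP not on the allowlist
```

The value of the default-deny outbound rule is that anything unexpected, malware phoning home included, fails loudly enough to show up in the logs.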
Coming up: Monitoring and lessons learned.