Colocation has been an important IT infrastructure option for decades. Recently, in direct response to the COVID pandemic, a new reason to use colocation has emerged.

COVID forced many employees to Work From Home (WFH) over the past 2 years. As WFH became more accepted, another use case for colocation emerged. The ability to quickly and cost-effectively place IT systems in a secure and conveniently located data center reduces risk when moving to a remote work environment.

More than 2 years after the onset of the pandemic, companies are changing how they work. Office leases are not being renewed. Smaller offices with flexible layouts are being set up to save money and to support hybrid work models where employees come to the office a few days a week. Some companies have completely abandoned their office to have employees work from home all the time.

For most companies, business cannot be conducted if critical computer systems are not available. Moving an office requires powering down IT equipment, so it is vitally important to prepare a plan that minimizes disruption.

Moving An Office Can Be Risky

Planning an office move can be stressful. The final decision to not renew an office lease is often made with only a few months left on a contract. Once a move date is set, the pressure is on to take care of a multitude of tasks. To minimize the risk of disrupting critical business operations during a transition, it is important to prepare a detailed plan.

Most organizations have migrated some computer workloads to the cloud. However, there are usually residual applications that are not well suited to the cloud. For example, database applications that require a large amount of outbound data transfer are extremely expensive when hosted in the Public Cloud due to costly egress fees. Other applications require low latency or high security and thus should be placed locally and not in the cloud.

For those applications already provisioned through a public or private cloud, the move from an office should not be disruptive. Once internet service is available at the new location, the applications may be used.

Other workloads may be suitable for the cloud but may not have been migrated yet. These applications should not be migrated to the cloud as part of the office move. It is too risky to add these types of rehosting projects to the primary task of a major office move. These workloads should be placed at the colocation facility temporarily until they can be safely migrated to the cloud at a future time.

Colocation Reduces Risk

With colocation it is possible to move workloads that are not suited for the cloud to a secure local data center. By decoupling the move of IT infrastructure from the rest of the office relocation, organizations can reduce the risk of a service interruption. Once computer systems have been placed at the colocation facility the rest of the office move can be completed at any time without concern about the day-to-day functioning of the business.

A growing number of companies in Connecticut and Westchester County planning an office down-sizing or a move to WFH have used CAPS’ colocation services to reduce risk and provide a bridge to the future.

Pictured above is the Old Drake Hill Flower Bridge. Originally built in 1892, this bridge spans the Farmington River in Simsbury, Connecticut. Exactly one hundred years after construction, cars were banned and the bridge was designated for pedestrian use only. A few years later it was decorated with flower boxes.

What are the most important factors to consider when choosing a colocation service provider? Here is a short list-

  • Redundant power
  • Reliable air conditioning to control temperature and humidity levels
  • Resilient internet connectivity with automatic failover
  • Advanced security systems
  • Remote Hands services
  • Convenient location

Location and Cost Drive Colocation Selection

Power with back-ups, multiple environmental systems, high availability internet services, security protection, and flexible support are must-have requirements for all colocation service providers. Data centers must check all these boxes to succeed in the competitive colocation business. Ultimately, the colocation facility’s location is the factor, other than cost, that dictates which data center is selected.

Which factors should be considered when choosing the location of a colocation facility? The facility should be close enough for staff visits as needed. Yet it should be far enough away to reduce the risk of the same environmental events that might impact the primary office location. The site also should be near major roads to minimize drive time. It is even better if the drive to the colocation facility is against traffic during those times when employees typically visit the data center.

It is also best if the colocation provider is powered by a different electric utility than the one that powers the primary place of work. Though the total loss of utility power is rare, the consequences of such a loss can be devastating. The probability of two separate electric utilities losing power at the same time is far less than the chance of a total outage at either one. For example, if each utility independently had a 1-in-1,000 chance of a total outage on a given day, the chance of both failing on the same day would be roughly 1 in 1,000,000.

Finally, here in Connecticut colocation costs can vary a lot based on real-estate costs. The cost per square foot for a data center in lower Fairfield County can be 2 or 3 times higher than the cost for the same amount of space in places like Shelton where CAPS’ data center is located.

Higher Elevations Lower Risk

The data center’s elevation above sea level is another location-based factor to consider, especially in Connecticut. Our state has many low-lying areas that are close to the shoreline, rivers and lakes. Though hurricanes and tornadoes can wreak havoc here, these extreme storms are rare. Floods, whether caused by storm surges or heavy rains, are much more common. The best way to avoid floods is to locate critical IT infrastructure at higher elevations.

All other things being equal, it is best to aim for higher ground when looking for a lower-risk place for your critical IT infrastructure. Connecticut, unlike our neighbors to the north, is a relatively flat state, ranking 36th among the states in highest elevation. The state's highest point, 2,379 feet above sea level, lies on the slopes of Mount Frissell in the northwest corner of the state.

So why not build a data center on Mount Frissell? There are data centers at very high elevations around the world, like the one in Tibet at 11,995 feet above sea level. Though the flood risk at such heights is minimal, building a data center on top of a mountain is very expensive. Air conditioning also costs more at higher elevations: because the air is thinner, more of it must flow over electronic systems to remove the same amount of heat. Finally, since Connecticut has few tall mountains, we should probably leave Mount Frissell to our hikers.

The CAPS data center in Shelton is head and shoulders above most of the other colocation sites in Connecticut. From the top level of the parking garage adjacent to the data center, 290 feet above sea level and high above the Upper Valley, you can look down on the restaurants and hotels along Bridgeport Avenue and watch the cars speeding along Route 8.

The fact that CAPS’ clients have not experienced an unscheduled power outage in over 20 years is due, in part, to the location of our data center in a flood-free zone far above sea level.

The dictionary defines colocation as the placement of two or more things in the same location. When the term is used with respect to IT infrastructure, most IT professionals know we are talking about specific data center services. A colocation facility is a data center where multiple clients can move their servers and other equipment to improve availability, increase security, and save money.

What is the difference between colocation and the Public Cloud? One way to answer this question is to consider the difference between living in an apartment and staying at a hotel. For those who love analogies, we can say colocation is to the Public Cloud as an apartment is to a hotel room.

The analogy is timely because the market in Connecticut for houses and apartments is booming; just as there is growing interest in colocation. The COVID-19 pandemic drove many New York City residents to the Connecticut suburbs to live in a less congested environment. This led to a shortage of affordable single-family homes. Many who would like to purchase a home are now settling for an apartment as they wait for home prices to recede.

COVID-19 also spawned the Work From Home transformation. Even as the pandemic subsides, many companies plan to have their employees continue to work remotely. Some companies have decided to downsize their offices or shutter them completely to save money while employees work from home. In these cases, colocation provides a proper home for those IT systems that are not suitable for the Cloud.

Colocation is like renting an apartment in several ways. Whether renting an apartment or colocating IT systems, the client provides the infrastructure. Client-owned servers and related IT equipment are housed at the colocation data center just as tenants provide the furnishings for the apartments in which they live.

Though it is possible in both a colocation agreement and an apartment lease for the client to be billed directly for utilities, it is more common for these services to be bundled into the monthly fee.

Finally, the period of the lease is comparable for both apartment rentals and colocation agreements. Most leases for apartments, as well as colocation contracts, are signed for a period of 1 or more years.

The Public Cloud is more like staying at a hotel. Services from AWS, Azure, or Google Cloud provide processing, memory, storage and connectivity resources to the client on demand. In a similar manner, a hotel guest expects their room to be outfitted with beds, a television, a refrigerator, linens, and more.

Whether ordering Public Cloud services or making a hotel reservation, arrangements can be made in a matter of minutes. In both cases contracts can be for a day or less. A long term commitment is not required.

Hotels and Public Cloud providers offer a great deal of flexibility, but occasionally there can be surprises at the end of an engagement. Though most hotel expenses are predictable, there can be some unexpected charges upon checkout. Who knew the cocktails and snacks available from the in-room mini-bar would be so expensive? In a similar way, unanticipated cloud charges due to egress fees and peak-hour surcharges can create budget overruns that are difficult to explain to management.

CAPS has been a leading provider of colocation services to organizations in Connecticut and New York for over twenty-five years. If you are looking for a better place for your servers please contact us.

Senior management does not like surprises, especially budget overruns. That is why colocation is so appealing to CIOs and CTOs at small and medium-sized businesses. Recognition that the Public Cloud can be more expensive than colocation is causing some organizations to repatriate workloads. The inability to accurately predict monthly expenses is another reason companies are choosing colocation over the Public Cloud.

Public Cloud Cost Overruns Are Common

A recent survey of 750 IT professionals by Pepperdata reported one third had Public Cloud budget overages. In some cases actual monthly costs exceeded budget by as much as 40%. In 2019, NASA spent 53% more on Public Cloud expenses than budgeted. Much of the $30 million overrun was due to unexpected data egress fees. Though going way over budget at a large federal agency may not be a career buster, the consequences are likely to be more severe for IT professionals at a small or medium-sized company.

The inability to accurately predict monthly expenses is due to the pricing methodology used by Public Cloud vendors. Cloud services are billed based upon actual resource utilization. While this sounds good (you only pay for what you use), this approach can wreak havoc with budgets. Pricing algorithms are complex, and monthly charges can vary a lot based on when services are used and where data flows.

Colocation Monthly Prices Are Fixed

Most colocation providers charge a fixed price for internet bandwidth services. The rate for an internet circuit with automatic failover to a backup circuit will be a fixed monthly fee based on the bandwidth (Mbps) of the circuit. Colocation customers know their internet charges will be the same from one month to the next. This is also true for monthly power and environmental charges.

Public Cloud providers typically price internet services based on the amount of data transferred during the month. Though there is often no charge for inbound data, the cost for outbound data transfer (egress) can be high. Public Cloud data transfer charges also may vary based on when data is sent and which data centers are involved in the transmission. Because pricing is based on actual network utilization, it can be very difficult to forecast Public Cloud internet costs for a given month.
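To see why forecasting is so hard, consider a back-of-the-envelope calculation. The Python sketch below is a minimal illustration; the tiered per-GB rates are assumptions chosen for the example and do not reflect any provider's actual price list.

```python
# Illustrative only: the tiered per-GB egress rates below are assumptions,
# not an actual price list. Real pricing varies by provider, region, and destination.

ASSUMED_EGRESS_TIERS = [
    (10_000, 0.09),    # first 10 TB per month at $0.09 per GB (assumed)
    (40_000, 0.085),   # next 40 TB per month at $0.085 per GB (assumed)
    (float("inf"), 0.07),
]

def estimate_egress_cost(gb_out_per_month: float) -> float:
    """Estimate a monthly egress bill for a given amount of outbound data."""
    cost, remaining = 0.0, gb_out_per_month
    for tier_size, rate in ASSUMED_EGRESS_TIERS:
        gb_in_tier = min(remaining, tier_size)
        cost += gb_in_tier * rate
        remaining -= gb_in_tier
        if remaining <= 0:
            break
    return cost

if __name__ == "__main__":
    # A reporting database that pushes 5 TB out one month and 20 TB the next
    # sees its transfer bill nearly quadruple, even though nothing else changed.
    for gb in (5_000, 20_000):
        print(f"{gb:>6} GB out -> ${estimate_egress_cost(gb):,.2f}")
```

Because the volume of outbound data changes with business activity, the same workload can produce very different bills from one month to the next.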

Public Cloud charges for compute and storage services also can vary based on when they are utilized. Sophisticated pricing models reward off-peak usage. In theory users can save money by accessing services during slack times. However, many clients are not willing or able to adapt to take advantage of lower-rate periods. The result is higher expenses and higher variability from one month to the next.

Cost Management is an Ongoing Requirement for Public Cloud

Ongoing cost management is a requirement for Public Cloud users that colocation customers do not have to worry about. Unlike colocation, where monthly fees are the same from one month to the next, the variability of Public Cloud expenses creates an ongoing management responsibility. Many organizations assign someone the task of monitoring Public Cloud expenses each month to determine the cause of cost increases and to modify usage patterns if needed.

To address the Public Cloud cost management challenge there are a growing number of cost management and cost optimization tools. Though each of the Public Cloud providers offers free tools such as AWS Cost Explorer, Azure Cost Management, and GCP Billing, these tools require trained personnel to use them effectively. Third-party tools like Harness Cloud Cost Management have more capabilities than the free Public Cloud tools. However, these advanced solutions can be expensive and also require a commitment to have a trained employee oversee their use.
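For a sense of what the free tooling involves, the sketch below pulls one month of unblended cost from the AWS Cost Explorer API using boto3. Treat it as a minimal sketch, assuming Cost Explorer is enabled on the account and the caller has permission to query it; a real cost review would break the numbers down by service and tag.

```python
# Minimal sketch: pull one month of unblended cost from AWS Cost Explorer via boto3.
# Assumes Cost Explorer is enabled and the caller has ce:GetCostAndUsage permission.
import boto3

def monthly_unblended_cost(start: str, end: str) -> float:
    """Return total unblended cost (USD) between start and end dates (YYYY-MM-DD)."""
    ce = boto3.client("ce")
    response = ce.get_cost_and_usage(
        TimePeriod={"Start": start, "End": end},
        Granularity="MONTHLY",
        Metrics=["UnblendedCost"],
    )
    total = 0.0
    for period in response["ResultsByTime"]:
        total += float(period["Total"]["UnblendedCost"]["Amount"])
    return total

if __name__ == "__main__":
    print(f"October spend: ${monthly_unblended_cost('2021-10-01', '2021-11-01'):,.2f}")
```

Someone still has to run queries like this every month, investigate the increases, and adjust usage, which is exactly the ongoing effort colocation customers avoid.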

There are use cases where the Public Cloud is the best IT infrastructure choice. However, just as there is a growing realization that Public Cloud may be more expensive for certain workloads, the unpredictable nature of Public Cloud monthly expenses often makes colocation the better choice.

For IT managers in Connecticut who would like to avoid the need to explain a big budget overrun to management, CAPS is pleased to offer colocation services with predictable monthly pricing from our secure data center in Shelton.

Hybrid Cloud is fast becoming the data architecture of choice. Hybrid Clouds incorporate a mix of on-premises, colocation, Public Cloud, and Private Cloud resources. Using orchestration software and networking, a flexible, optimized architecture can be built.

Characteristics of Public and Private Clouds

Public Cloud services such as AWS, Microsoft Azure, and Google Cloud offer flexibility and scalability with minimal capital expenses. Services can be brought online in minutes via self-service portals. A wide variety of processing and storage options are available. However, Public Cloud services employ a Shared Responsibility model requiring knowledge of complex and changing environments to assure adequate security. Pricing models are difficult to understand and costs can increase unexpectedly due to egress fees. Latency can also be a problem with Public Clouds, as can ensuring that compliance requirements are met.

Private Cloud is typically more expensive than Public Cloud but it offers better security, lower latency, and better compliance assurance. The cost of Private Cloud services is usually more transparent than Public Cloud services.

Colocation Characteristics

Colocation data centers are often selected to provide low latency services. Security and compliance are better with colocation than with Public Clouds. Though colocation typically involves higher capital expenses than Public or Private Clouds, little new capital may be required if the IT systems to be used have already been purchased. Customized solutions, ongoing support, and predictable pricing are features of colocation. The ability to add or delete services immediately and self-service functions are not generally available with colocation.

On Premises Characteristics

On-premises solutions are sometimes preferred. Systems installed onsite are well understood and have been proven over time. Infrastructure may have already been paid for and adequate space may be available. Applications that are unique often perform best on premises, and latency can be minimized. However, availability may be jeopardized due to lower levels of redundancy. On-premises solutions are not an option when organizations close their offices to Work From Home.

Types of Workloads

There are many different types of workloads. Each workload may require different infrastructure features for optimal performance. Listed below are several common workload types and the infrastructure best suited to meet specific requirements.

  1. Websites

Websites benefit from the elasticity and scalability of Public Cloud solutions. In almost all cases the Public Cloud is the best choice for hosting websites. Exceptions would be websites that require extremely low latency, that have strict compliance demands, or that require large outbound data transfers where egress fees can become exorbitant.

  2. Financial Trading Applications

Private Cloud or Colocation services are usually preferred to provide low latency, high security, and high compliance solutions.

  3. SaaS Applications

Public Cloud services are usually the best choice for SaaS due to their low entry costs, scalability, and flexibility. In some cases, Private Cloud services are used to provide enhanced security and compliance.

  4. Workloads with Big Data Transfer Requirements

Applications that transfer large amounts of data from a database and applications transmitting large outbound video files can quickly become extremely expensive if hosted on the Public Cloud. This is because Public Cloud providers charge egress fees for outbound data transfers. Organizations are repatriating these workloads to colocation facilities to save money.

  5. Offsite Data Backup and Restoral

Offsite data backup is essential to protect against ransomware and other cyber breaches. Though the Public Cloud provides a low cost option for storing data backups and for long term data archival, the egress cost of downloading this data for tests or data restoral can become excessive. In these cases, offsite data backup is best done at a colocation facility.

Each workload has different requirements for optimal performance. The flexibility of the Hybrid Cloud architecture makes it possible to host each application on the most appropriate infrastructure. Please contact CAPS to discuss your Hybrid Cloud colocation and Private Cloud service requirements.


As the pandemic subsides, Connecticut businesses are struggling to find workers. Information technology personnel are especially scarce in the Nutmeg State. Fortunately, colocation services can help address these shortages.

Hard to Find and Retain Workers in Connecticut

A recent survey conducted by the Connecticut Business and Industry Association found 81% of employers in Connecticut said they are having difficulty finding and retaining workers. Employees with critical IT skills are particularly hard to come by. In fact, the Connecticut Department of Labor reported there were 1,100 fewer Information Industry Sector workers employed in our state in October 2021 than there were in October 2020.

There are several theories about why so many people have not come back to work since the pandemic began. It was initially believed that extended government benefits enabled some people to delay returning to work. Those benefits ended in September, yet the number who have not returned to work remains higher than expected, so government income support does not fully explain current labor shortages.

The Benefits of Working From Home

Another theory is people got used to the benefits of working from home after working remotely for many months. Those who work at home can provide some level of care for children and other family members. Home workers also save the cost and time associated with commuting to and from work. In Connecticut where child care and senior care services are expensive and where commuting times of an hour or more are not unusual, the value of working from home is especially high. Those who work at home can also save money on meals and clothing and can run a load of laundry while working.

Finally, Connecticut has an older population than most other states. With a median age of 41.1 years, the state is tied with Delaware as the sixth oldest in the nation. Older employees are more likely to be able to retire and may have decided to do just that after working from home during COVID.

How Colocation Helps

Colocation makes it possible for companies to reallocate some IT infrastructure tasks. Skilled IT professionals at the colocation data center assist with IT system installation and configuration. They can set up automatic failover communications services to assure internet availability. Colocation providers also employ a variety of tools to continuously monitor power and environmental conditions such as temperature and humidity. Security is monitored 24/7/365, and data backup and restoral services may also be available. Remote hands services are typically also provided by colocation providers to check systems and cycle power when necessary.

By moving information technology systems to a colocation facility such as the CAPS data center in Shelton, companies can offload some of the work from their IT team. This makes it possible to get more work accomplished with fewer personnel and thus helps address Connecticut’s IT labor shortage.

In addition, by facilitating remote operations, colocation makes it possible for companies to staff their IT department off premises. This may be the difference between keeping or losing a key employee who would rather quit than give up the perks of working from home. It also makes it easier to hire additional IT professionals when living outside of Connecticut is an option.

Did you know Connecticut enacted a new cybersecurity law this month? Connecticut joined Ohio and Utah as one of just three states with legislation providing Safe Harbor protection against punitive damages when companies are sued for the consequences of a cyber breach.

Connecticut Public Act No. 21-119 went into effect on October 1st. Titled “An Act Incentivizing the Adoption of Cybersecurity Standards for Business”, the new law aims to give organizations an incentive to take steps that reduce their exposure to hackers.

The new law raises questions. To address these questions this article describes the statute at a high level and suggests how Connecticut companies who comply with the law can reduce risk and limit exposure to punitive damages. It also describes the steps required to implement one of the law’s accepted frameworks. This article does not offer legal advice. Organizations should seek qualified legal counsel where needed.

Is My Organization at Risk?

The simple answer is yes regarding the probability of a cyber breach. Every organization, large or small, is susceptible to computer breaches. Many public and private organizations in Connecticut have already suffered from ransomware attacks and other cyber crimes. The identity of victims is not always made public but recently attacks in Connecticut against 9 public school systems, 4 hospitals and even a town police department have been reported.

If you are breached, what are the odds your company will be sued and you will be subjected to punitive damages? Currently the odds of being sued are not high, but there is reason to believe there will be a growing number of punitive damage suits in the future. Companies in the health care or financial services industries that handle Personally Identifiable Information (PII) and other companies that process credit card transactions should be vigilant.

What Protection Does Connecticut Public Act No. 21-119 Provide?

The new law offers protection against punitive damages if a company conforms to an industry recognized cybersecurity framework. The specific legislation states-

“In any cause of action founded in tort that is brought under the laws of this state or in the courts of this state and that alleges that the failure to implement reasonable cybersecurity controls resulted in a data breach concerning personal information or restricted information, the Superior Court shall not assess punitive damages against a covered entity if such entity created, maintained and complied with a written cybersecurity program that contains administrative, technical and physical safeguards for the protection of personal or restricted information and that conforms to an industry recognized cybersecurity framework.”

Can You Provide an Example of an Acceptable Cybersecurity Framework?

The legislation references 6 different cybersecurity frameworks it accepts. Any of these frameworks can be used to meet the statute’s requirements. To receive the law’s protection, organizations must keep current with framework releases and must not exhibit gross negligence or willful or wanton conduct.

The first approved framework listed is “The Framework for Improving Critical Infrastructure Cybersecurity” published by the National Institute of Standards and Technology. We will next describe the requirements of this framework to indicate the amount of effort needed to be compliant.

What Must Be Done to Conform to the NIST Framework?

The NIST Framework for Improving Critical Infrastructure Cybersecurity defines 5 Functions, 23 Categories, and 108 Sub-Categories. Through an ongoing process of self-assessment, organizations use the framework to gauge their level of cybersecurity preparedness. Users rate their cybersecurity maturity for each Sub-Category by assigning a Framework Implementation Tier ranging from 1 to 4. Framework Profiles are then prepared describing the Current Profile and Target Profile for each Sub-Category to establish goals for future improvement.

The following table provides examples of 5 different Sub-Categories (One from each Function).

Function | Category | Sub-Category ID | Sub-Category Description
Identify | Governance | ID.GV-3 | Legal and regulatory requirements regarding cybersecurity, including privacy and civil liberties obligations, are understood and managed
Protect | Information Protection Processes and Procedures | PR.IP-4 | Backups of information are conducted, maintained, and tested
Detect | Anomalies and Events | DE.AE-5 | Incident alert thresholds are identified
Respond | Mitigation | RS.MI-1 | Incidents are contained
Recover | Communications | RC.CO-3 | Recovery activities are communicated to internal and external stakeholders as well as executive and management teams
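To give a feel for the bookkeeping the framework implies, here is a minimal sketch of how the self-assessment might be recorded. The Sub-Category IDs and descriptions come from the table above; the tier values and the data layout are illustrative assumptions, not part of the NIST publication.

```python
# Illustrative self-assessment record for a few NIST CSF Sub-Categories.
# Tier values (1-4) and targets are made-up examples, not NIST guidance.
from dataclasses import dataclass

@dataclass
class SubCategoryAssessment:
    sub_category_id: str   # e.g. "PR.IP-4"
    description: str
    current_tier: int      # 1 = Partial ... 4 = Adaptive
    target_tier: int

assessments = [
    SubCategoryAssessment("ID.GV-3", "Legal and regulatory requirements are understood and managed", 2, 3),
    SubCategoryAssessment("PR.IP-4", "Backups of information are conducted, maintained, and tested", 3, 4),
    SubCategoryAssessment("DE.AE-5", "Incident alert thresholds are identified", 1, 3),
]

# List the gaps that should drive next year's improvement plan.
for a in assessments:
    if a.current_tier < a.target_tier:
        print(f"{a.sub_category_id}: tier {a.current_tier} -> {a.target_tier}  ({a.description})")
```

Keeping a written record like this, reviewed and updated regularly, is part of what the statute means by a "written cybersecurity program."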


Is There a Cost Associated?

There is no cost to access and use the NIST Framework for Improving Critical Infrastructure Cybersecurity. Though there are consultants who can help your organization implement the framework, NIST neither requires nor certifies consultants for this purpose.

What Should We Do Next?

We believe all Connecticut companies and non-profit organizations should consider their cybersecurity risk exposure. Connecticut Public Act No. 21-119 provides an impetus to assess your current vulnerabilities and to take steps to make your organization’s information assets more secure.

If you have not done so already, assign a member of your company’s management team to learn about the law and the available frameworks. Select an approved framework and use that tool to help drive your organization to continuous improvement. In addition to receiving the new law’s protections you will reduce the chances of a costly cyber breach.

If you have any questions, CAPS will be happy to share our advice at no charge. Though this law is new we have been helping Connecticut companies protect their valuable data resources for over 25 years.

Public Cloud services offer flexibility and scalability. However, Public Cloud security challenges are real. In most cases colocation offers a more secure environment.

A recent survey by Proofpoint of over 600 organizations revealed the following troubling responses-

  • 72% of the respondents said moving to the Cloud coupled with more mobile workforces has introduced new security and compliance risks to their organization
  • 75% of the respondents said the increased use of Cloud applications and services without the explicit approval of the IT department (Shadow IT) presents a significant security risk
  • 78% of the companies surveyed say employees have accidentally exposed sensitive data stored in the Public Cloud

Lightspin, an Israeli Cloud penetration testing company, recently conducted an analysis of AWS and found 46% of the 40,000 S3 buckets they reviewed appeared to be misconfigured and could be unsafe.

Let’s look at several recent major security breaches experienced by users of both Amazon Web Services and Microsoft Azure.

Recent AWS Breaches

A company that provides channel management software to the online travel industry had its AWS data breached. In November of 2020 a failure to appropriately configure AWS Simple Storage Service (S3) buckets left over 24 GB of data with over 10 million files exposed. Data going back 10 years including credit card information, email addresses, and phone numbers was compromised.

In March of 2021 over 50,000 patient records with Protected Health Information (PHI) including medical insurance identification, driver’s licenses, and passport information were leaked in Utah. The company that was attacked was providing COVID testing services and incorrectly set up AWS S3 buckets to store test results.

This past summer dozens of municipal governments had over 1 TB of data across 1.6 million files exposed. A Geographic Information System (GIS) used by these local governments was built upon AWS. Once again S3 buckets were misconfigured.

Recent Azure Breaches

It is not just AWS that suffers from security breaches. Last December an application developer left their Azure Blob Storage with 587,000 files open. Medical records, insurance documents, and other PHI were exposed.

In August, Microsoft warned thousands of Azure Cosmos DB users that their data may have been exposed. The flaw, which was remedied via a subsequent patch, could potentially allow a user to gain access to another customer’s resources. Security investigators were able to gain complete unrestricted access to the accounts and databases of several thousand Azure customers including some Fortune 500 accounts.

Enabling Public Cloud Breaches

Though criminals and disaffected employees are ultimately responsible for breaches of the Public Cloud, certain practices enable these attacks. Misconfigurations lead to Public Cloud breaches by exposing an organization’s data. The reasons why Public Cloud misconfigurations are so common include-

  1. There are many configuration options for the Public Cloud
  2. Ongoing software changes by Public Cloud providers can go unnoticed by users
  3. Lack of Public Cloud configuration expertise, especially for Shadow IT projects
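To make the misconfiguration point concrete, the sketch below uses boto3 to flag S3 buckets that lack a bucket-level Public Access Block configuration or still allow some form of public access. It is a minimal illustration, assuming read-only AWS credentials, and is not a substitute for a full configuration audit.

```python
# Minimal sketch: flag S3 buckets with missing or incomplete Public Access Block settings.
# Assumes read-only credentials; a real audit would also review bucket policies and ACLs.
import boto3
from botocore.exceptions import ClientError

s3 = boto3.client("s3")

for bucket in s3.list_buckets()["Buckets"]:
    name = bucket["Name"]
    try:
        config = s3.get_public_access_block(Bucket=name)["PublicAccessBlockConfiguration"]
        if not all(config.values()):
            print(f"WARNING: {name} does not block all forms of public access: {config}")
    except ClientError as err:
        if err.response["Error"]["Code"] == "NoSuchPublicAccessBlockConfiguration":
            print(f"WARNING: {name} has no Public Access Block settings at all")
        else:
            raise
```

Checks like this are simple, but someone has to know to run them, and to keep running them as new buckets and new configuration options appear.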

Though misconfigurations are the most frequent breach enabler, there are other things that can expose Public Cloud applications to an attack-

  1. Poor password practices
  2. Inadequate access restrictions
  3. Mismanaged permission controls
  4. Inactive data encryption
  5. Insufficient API security
  6. Neglected workloads

Shared Responsibility Model Can Cause Problems

Public Cloud service providers employ a Shared Responsibility Model to delineate responsibility for various components of security. Public Cloud providers take responsibility for securing their Cloud infrastructure. Clients are responsible for the security of their applications. Client confusion about their specific security responsibilities can create security gaps that can be exploited.

Why Colocation is More Secure

Colocation is inherently more secure than the Public Cloud.

  • With colocation each client can deploy and configure its own firewalls to provide the most protection for its organization, with far more control than the shared firewall services of the Public Cloud.
  • With colocation the risks due to complex and changing configuration options are all but eliminated. Colocation clients own their systems and have much more control over any changes.
  • It is much easier to prevent exposure due to Shadow IT operations since these are not conducted on the colocated systems.
  • Colocation service providers, such as CAPS, provide ongoing technical support and monitoring to make sure systems are operating properly and to detect inappropriate activities.


The dog days of summer are upon us. This season reminds us of the critical importance of data center air conditioning systems. Though intense summer heat puts server rack cooling systems to the test, protecting against rising cabinet temperatures is no longer a problem restricted to the summer. As power densities in server racks increase, the need to carefully monitor and control temperature and humidity has become an even more important year-round concern.

Until recently most data cabinets drew 5 kW or less. Now, as cabinets are packed with more powerful processors and more compact servers, the electrical load per cabinet is increasing. It is not unusual for the power per cabinet to be twice as much as it was only a few years ago. In the near term, power consumption per cabinet is expected to grow by several multiples more.

More Power In Equals More Heat Out

More electrical power input produces increased heat output. Since elevated temperatures can damage critical IT systems and cause outages, it is more important than ever to manage the temperature in data centers. Relative Humidity levels can also impact service, so they too must be regulated.

Colocation data centers like CAPS are experts at managing heat and humidity while powering clients’ IT systems economically. Computer Room Air Conditioning (CRAC) systems direct cooled air to the front of servers and remove heated air from the back of these systems. Cabinets are set up for Hot Aisle and Cold Aisle Containment to maximize efficiency. CRAC systems also control humidity in the data center.

Electricity is needed to run IT systems but also to power the environmental systems that control both temperature and humidity. The additional power drawn by today's high-performance server cabinets in turn requires more power for the air conditioning systems that remove the extra heat.

What Should Temperature and Humidity Be?

The American Society of Heating, Refrigerating and Air Conditioning Engineers recommends the following ranges for temperature and humidity in Data Centers-

Temperature: 64°F to 81°F
Relative Humidity: 20% to 80%

Sustained elevated temperatures can lead to component failures and outages. Low Relative Humidity can increase the likelihood of Electrostatic Discharges which can damage equipment. High Relative Humidity can lead to hygroscopic dust contamination that can cause current leakage or shorts. High humidity also leads to condensation that can cause corrosion and equipment failures.

CAPS drives itself to the highest performance standards to protect the critical IT systems placed in our data center by colocation clients. CAPS continuously monitors the temperature and humidity in its data center and adjusts its systems as required to maintain temperatures between 69° F and 72° F and Relative Humidity between 40% and 50%.
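As a simple illustration of the kind of threshold check this monitoring performs, here is a minimal sketch. The sensor-reading function is a hypothetical placeholder, and real systems alert and adjust automatically, but the target and ASHRAE ranges are the ones quoted above.

```python
# Minimal sketch of a temperature/humidity threshold check.
# get_sensor_readings() is a hypothetical stand-in for a real monitoring feed.

TARGET_TEMP_F = (69.0, 72.0)    # operating target quoted above
ASHRAE_TEMP_F = (64.0, 81.0)    # ASHRAE recommended range quoted above
TARGET_RH = (40.0, 50.0)
ASHRAE_RH = (20.0, 80.0)

def check(value: float, target: tuple, limit: tuple, label: str) -> None:
    low, high = limit
    if not (low <= value <= high):
        print(f"ALARM: {label} {value} outside ASHRAE range {limit}")
    elif not (target[0] <= value <= target[1]):
        print(f"Adjust: {label} {value} drifting outside target {target}")

def get_sensor_readings() -> tuple[float, float]:
    """Hypothetical placeholder for a real sensor feed (temperature °F, relative humidity %)."""
    return 73.5, 44.0

if __name__ == "__main__":
    temp_f, rh = get_sensor_readings()
    check(temp_f, TARGET_TEMP_F, ASHRAE_TEMP_F, "temperature (°F)")
    check(rh, TARGET_RH, ASHRAE_RH, "relative humidity (%)")
```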

Colocation Provides the Best Environment

As servers become more powerful and generate more heat, the case is even more compelling for colocation. Many corporate data centers do not have the sophisticated environmental systems and knowledge required to protect today’s high performance servers. Colocation data center professionals have the expertise and tools needed to cost-effectively regulate temperature and humidity to protect clients’ valuable investment in Information Technology.


Connecticut suffers from an acute shortage of IT support personnel. It is hard to find and retain skilled engineers anywhere in the country. It is especially difficult here due to our high cost of living. Though many companies outsource IT support to Managed Service Providers, MSPs also struggle to hire capable and affordable technical talent. They too can use some help.

CAPS employs skilled engineers to support its colocation, data backup and restoral, disaster recovery, and business continuity clients. These engineers are experts in electrical power, air conditioning, cabling, networking, software licensing, backup and recovery, business continuity, and data security.

CAPS’ clients have found that the ongoing free support from CAPS’ professionals helps them get more done with less. Here are some recent examples of free IT support provided by CAPS’ engineers.

Electrical Power Consulting

A media company client was growing dramatically and needed to expand its colocation footprint. Its applications are power intensive, so cabinet designs must be planned carefully. The engineering team at CAPS, led by a licensed electrician, helped the client select the best cabinet layout and power distribution plan to support its immediate and future requirements.

Data Backup and Recovery Consulting

A consumer products company repatriated a large database from the Public Cloud and purchased new servers which it colocated at CAPS. While meeting with a CAPS engineer to plan for additional colocation space, the company’s data backup challenges were discussed. The engineer explained how it would be possible to restore data faster and at a much more granular level using Veeam Backup and Replication software. When the client decided to migrate to the Veeam solution, the CAPS engineering team provided training and assistance enabling a successful cutover in under a week.

Network Monitoring

A telecommunications service provider was experiencing high packet losses on some of its internet circuits. Network engineers at CAPS monitored the circuits and worked with the client’s engineers to identify a configuration problem that was quickly resolved. While working with the client to isolate the problem the CAPS engineers temporarily added bandwidth in increments to establish that the problem was not primarily due to transmission capacity.

Remote Hands

A bank needed to reseat a drive on a server. Rather than dispatch one of their own personnel they asked the CAPS engineers for help. This saved at least an hour plus the cost of driving to and from the data center. CAPS includes four hours of free remote hands services per month for its colocation clients.

Technical support is frequently a secondary consideration when selecting a colocation provider. However, many of CAPS’ clients have found the ongoing expert help included as part of a colocation engagement is a very significant benefit given the scarcity of available engineering talent.