The dictionary defines colocation as the placement of two or more things together. When the term is used with respect to IT infrastructure, most IT professionals know we are talking about specific data center services. A colocation facility is a data center where multiple clients can move their servers and other equipment to improve availability, increase security, and save money.

What is the difference between colocation and the Public Cloud? One way to answer this question is to consider the difference between living in an apartment and staying at a hotel. For those who love analogies, we can say Colocation is to an apartment as the Public Cloud is to a hotel room.

The analogy is timely because the market in Connecticut for houses and apartments is booming, just as interest in colocation is growing. The COVID-19 pandemic drove many New York City residents to the Connecticut suburbs in search of a less congested environment. This led to a shortage of affordable single-family homes. Many who would like to purchase a home are now settling for an apartment while they wait for home prices to recede.

COVID-19 also spawned the Work From Home transformation. Even as the pandemic subsides, many companies plan to have their employees continue to work remotely. Some companies have decided to downsize their offices or shutter them completely to save money while employees work from home. In these cases, colocation provides a proper home for those IT systems that are not suitable for the Cloud.

Colocation is like renting an apartment in several ways. Whether renting an apartment or collocating IT systems, the client provides the infrastructure. Client-owned servers and related IT equipment are housed at the colocation data center, just as tenants provide the furnishings for the apartments in which they live.

Though it is possible in both a colocation agreement and an apartment lease for the client to be billed directly for utilities, it is more common for these services to be bundled into the monthly fee.

Finally, the period of the lease is comparable for both apartment rentals and colocation agreements. Most leases for apartments, as well as colocation contracts, are signed for a period of 1 or more years.

The Public Cloud is more like staying at a hotel. Services from AWS, Azure, or Google Cloud provide processing, memory, storage, and connectivity resources to the client on demand. In a similar manner, a hotel guest expects their room to be outfitted with beds, a television, a refrigerator, linens, and more.

Whether ordering Public Cloud services or making a hotel reservation, arrangements can be made in a matter of minutes. In both cases, contracts can be for a day or less; a long-term commitment is not required.

Hotels and Public Cloud providers offer a great deal of flexibility, but occasionally there can be surprises at the end of an engagement. Though most hotel expenses are predictable, there can be some unexpected charges upon checkout. Who knew the cocktails and snacks available from the in-room mini-bar would be so expensive? In a similar way, unanticipated cloud charges due to egress fees and peak-hour surcharges can create budget overruns that are difficult to explain to management.

CAPS has been a leading provider of colocation services to organizations in Connecticut and New York for over twenty-five years. If you are looking for a better place for your servers please contact us.

Senior management does not like surprises, especially budget overruns. That is why colocation is so appealing to CIOs and CTOs at small and medium-sized businesses. Recognition that the Public Cloud can be more expensive than colocation is causing some organizations to repatriate workloads. The inability to accurately predict monthly expenses is another reason companies are choosing colocation over the Public Cloud.

Public Cloud Cost Overruns Are Common

A recent survey of 750 IT professionals by Pepperdata reported that one-third had Public Cloud budget overages. In some cases actual monthly costs exceeded budget by as much as 40%. In 2019, NASA spent 53% more on Public Cloud services than it had budgeted. Much of the $30 million overrun was due to unexpected data egress fees. Though going far over budget at a large federal agency may not be a career buster, the consequences are likely to be more severe for IT professionals at a small or medium-sized company.

The inability to accurately predict monthly expenses is due to the pricing methodology used by Public Cloud vendors. Cloud services are billed based upon actual resource utilization. While this sounds good (you only pay for what you use), this approach can wreak havoc with budgets. Pricing algorithms are complex, and monthly charges can vary widely depending on when services are used and where data flows.
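To see how utilization-based pricing drives this variability, consider a minimal billing sketch. All of the rates and workload figures below are hypothetical illustrations, not any provider's actual prices:

```python
# Minimal sketch of utilization-based cloud billing.
# Every rate below is a made-up illustration, not a real provider's price.

COMPUTE_RATE_PER_HOUR = 0.10   # per vCPU-hour (hypothetical)
STORAGE_RATE_PER_GB = 0.023    # per GB-month (hypothetical)
EGRESS_RATE_PER_GB = 0.09      # per GB transferred out (hypothetical)
PEAK_SURCHARGE = 1.25          # multiplier applied to peak-hour compute

def monthly_bill(compute_hours, peak_fraction, storage_gb, egress_gb):
    """Estimate a month's bill; egress and peak-hour usage drive the variability."""
    peak_hours = compute_hours * peak_fraction
    off_peak_hours = compute_hours - peak_hours
    compute = (off_peak_hours + peak_hours * PEAK_SURCHARGE) * COMPUTE_RATE_PER_HOUR
    storage = storage_gb * STORAGE_RATE_PER_GB
    egress = egress_gb * EGRESS_RATE_PER_GB
    return round(compute + storage + egress, 2)

# Identical compute and storage, but one large data export balloons the bill:
print(monthly_bill(720, 0.5, 500, 100))   # typical month
print(monthly_bill(720, 0.5, 500, 2000))  # month with a large outbound transfer
```

The same server footprint produces very different bills depending solely on outbound data volume, which is exactly why forecasting is hard.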

Colocation Monthly Prices Are Fixed

Most colocation providers charge a fixed price for internet bandwidth services. The rate for an internet circuit with automatic failover to a backup circuit will be a fixed monthly fee based on the bandwidth (Mbps) of the circuit. Colocation customers know their internet charges will be the same from one month to the next. This is also true for monthly power and environmental charges.

Public Cloud providers typically price internet services based on the amount of data transferred during the month. Though there is often no charge for inbound data, the cost for outbound data transfer (egress) can be high. Public Cloud data transfer charges also may vary based on when data is sent and which data centers are involved in the transmission. Though pricing is based on actual network utilization, it can be very difficult to forecast Public Cloud internet costs for a given month.

Public Cloud charges for compute and storage services can also vary based on when they are utilized. Sophisticated pricing models reward off-peak usage. In theory users can save money by accessing services during slack times. However, many clients are not willing or able to adapt to take advantage of lower-rate periods. The result is higher expenses and greater variability from one month to the next.

Cost Management is an Ongoing Requirement for Public Cloud

Ongoing cost management is a requirement for Public Cloud users that colocation customers do not have to worry about. Unlike colocation, where monthly fees are the same from one month to the next, the variability of Public Cloud expenses creates an ongoing management responsibility. Many organizations assign someone the task of monitoring Public Cloud expenses each month to determine the cause of cost increases and to modify usage patterns if needed.

To address the Public Cloud cost management challenge, there are a growing number of cost management and cost optimization tools. Though each of the Public Cloud providers offers free tools such as AWS Cost Explorer, Azure Cost Management, and GCP Billing, these tools require trained personnel to use them effectively. Third-party tools like Harness Cloud Cost Management have more capabilities than the free Public Cloud tools. However, these advanced solutions can be expensive and also require a trained employee to oversee their use.
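The monthly monitoring task these tools support can be reduced to a simple variance check. The sketch below is illustrative only; the spend figures and the 10% threshold are invented for this example:

```python
# Illustrative budget-variance check of the kind a cloud cost monitor performs.
# The spend figures and the 10% threshold are invented for illustration.

def flag_overruns(monthly_spend, budget, threshold=0.10):
    """Return (month, overrun fraction) for months exceeding budget by > threshold."""
    return [
        (month, round((spend - budget) / budget, 2))
        for month, spend in monthly_spend.items()
        if (spend - budget) / budget > threshold
    ]

spend = {"Jan": 10200, "Feb": 11800, "Mar": 14300, "Apr": 10900}
print(flag_overruns(spend, budget=10000))  # [('Feb', 0.18), ('Mar', 0.43)]
```

Each flagged month would then need someone to dig into the billing detail to find the cause, which is the ongoing staffing cost the article describes.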

There are use cases where the Public Cloud is the best IT infrastructure choice. However, along with the growing realization that the Public Cloud can be more expensive for certain workloads, the unpredictable nature of Public Cloud monthly expenses often makes colocation the better choice.

For IT managers in Connecticut who would like to avoid the need to explain a big budget overrun to management, CAPS is pleased to offer colocation services with predictable monthly pricing from our secure data center in Shelton.

Hybrid Cloud is fast becoming the data architecture of choice. Hybrid Clouds incorporate a mix of on-premises, colocation, Public Cloud, and Private Cloud resources. Using orchestration software and networking, a flexible, optimized architecture can be built.

Characteristics of Public and Private Clouds

Public Cloud services such as AWS, Microsoft Azure, and Google Cloud offer flexibility and scalability with minimal capital expenses. Services can be brought online in minutes via self-service portals. A wide variety of processing and storage options are available. However, Public Cloud services employ a Shared Responsibility model requiring knowledge of complex and changing environments to ensure adequate security. Pricing models are difficult to understand, and costs can increase unexpectedly due to egress fees. Latency can also be a problem with Public Clouds, as can ensuring that compliance requirements are met.

Private Cloud is typically more expensive than Public Cloud but it offers better security, lower latency, and better compliance assurance. The cost of Private Cloud services is usually more transparent than Public Cloud services.

Colocation Characteristics

Colocation data centers are often selected to provide low latency services. Security and compliance are better with colocation than with Public Clouds. Though capital expenses are generally higher than with Public or Private Clouds, colocation may require little new capital if the IT systems to be used have already been purchased. Customized solutions, ongoing support, and predictable pricing are features of colocation. The ability to add or delete services immediately and self-service functions are generally not available with colocation.

On-Premises Characteristics

On-premises solutions are sometimes preferred. Systems installed onsite are well understood and have been proven over time. Infrastructure may already have been paid for, and adequate space may be available. Applications that are unique often perform best on premises, and latency can be minimized. However, availability may be jeopardized by lower levels of redundancy. On-premises solutions are also not an option when organizations close their offices in favor of Work From Home.

Types of Workloads

There are many different types of workloads. Each workload may require different infrastructure features for optimal performance. Listed below are several common workload types and the infrastructure best suited to meet specific requirements.

  1. Websites

Websites benefit from the elasticity and scalability of Public Cloud solutions. In almost all cases the Public Cloud is the best choice for hosting websites. Exceptions would be for websites that require extremely low latency, that have strict compliance demands, or that require large data outbound transfers where egress fees can become exorbitant.

  2. Financial Trading Applications

Private Cloud or Colocation services are usually preferred to provide low latency, high security, and high compliance solutions.

  3. SaaS Applications

Public Cloud services are usually the best choice for SaaS due to their low entry costs, scalability, and flexibility. In some cases, Private Cloud services are used to provide enhanced security and compliance.

  4. Workloads with Big Data Transfer Requirements

Applications that transfer large amounts of data from a database and applications transmitting large outbound video files can quickly become extremely expensive if hosted on the Public Cloud. This is because Public Cloud providers charge egress fees for outbound data transfers. Organizations are repatriating these workloads to colocation facilities to save money.

  5. Offsite Data Backup and Restoral

Offsite data backup is essential to protect against ransomware and other cyber breaches. Though the Public Cloud provides a low cost option for storing data backups and for long term data archival, the egress cost of downloading this data for tests or data restoral can become excessive. In these cases, offsite data backup is best done at a colocation facility.

Each workload has different requirements for optimal performance. The flexibility of the Hybrid Cloud architecture makes it possible to host each application on the most appropriate infrastructure. Please contact CAPS to discuss your Hybrid Cloud colocation and Private Cloud service requirements.



As the Pandemic subsides, Connecticut businesses are struggling to find workers. Information technology personnel are especially scarce in the Nutmeg State. Fortunately, colocation services can help address these shortages.

Hard to Find and Retain Workers in Connecticut

A recent survey conducted by the Connecticut Business and Industry Association found 81% of employers in Connecticut said they are having difficulty finding and retaining workers. Employees with critical IT skills are particularly hard to come by. In fact, the Connecticut Department of Labor reported there were 1,100 fewer Information Industry Sector workers employed in our state in October 2021 than there were in October 2020.

There are several theories about why so many people have not come back to work since the Pandemic began. It was initially believed extended government benefits enabled some people to delay returning to work. Those benefits ended in September. The number who have not returned to work remains higher than expected so government income support does not fully explain current labor shortages.

The Benefits of Working From Home

Another theory is people got used to the benefits of working from home after working remotely for many months. Those who work at home can provide some level of care for children and other family members. Home workers also save the cost and time associated with commuting to and from work. In Connecticut where child care and senior care services are expensive and where commuting times of an hour or more are not unusual, the value of working from home is especially high. Those who work at home can also save money on meals and clothing and can run a load of laundry while working.

Finally, Connecticut has an older population than most other states. With a median age of 41.1 years, the state is tied with Delaware as the sixth oldest in the nation. Older employees are more likely to be able to retire and may have decided to do just that after working from home during COVID.

How Colocation Helps

Colocation makes it possible for companies to reallocate some IT infrastructure tasks. Skilled IT professionals at the colocation data center assist with IT system installation and configuration. They can set up automatic failover communications services to assure internet availability. Colocation providers also employ a variety of tools to continuously monitor power and environmental conditions such as temperature and humidity. Security is monitored 24/7/365, and data backup and restoral services may also be available. Remote hands services are also typically provided to check systems and power-cycle equipment when necessary.

By moving information technology systems to a colocation facility such as the CAPS data center in Shelton, companies can offload some of the work from their IT team. This makes it possible to get more work accomplished with fewer personnel and thus helps address Connecticut’s IT labor shortage.

In addition, by facilitating remote operations, colocation makes it possible for companies to staff their IT department off premises. This may be the difference between keeping and losing a key employee who would rather quit than give up the perks of working from home. It will also make it easier to hire additional IT professionals if living outside of Connecticut is now an option.

Did you know Connecticut enacted a new cybersecurity law this month? Connecticut joined Ohio and Utah as one of just three states with legislation providing Safe Harbor protection against punitive damages when companies are sued for the consequences of a cyber breach.

Connecticut Public Act No. 21-119 went into effect on October 1st. Titled “An Act Incentivizing the Adoption of Cybersecurity Standards for Business”, the goal of the new law is to provide an incentive for organizations to take steps to reduce exposure to hackers.

The new law raises questions. To address these questions this article describes the statute at a high level and suggests how Connecticut companies who comply with the law can reduce risk and limit exposure to punitive damages. It also describes the steps required to implement one of the law’s accepted frameworks. This article does not offer legal advice. Organizations should seek qualified legal counsel where needed.

Is My Organization at Risk?

Regarding the probability of a cyber breach, the simple answer is yes. Every organization, large or small, is susceptible to computer breaches. Many public and private organizations in Connecticut have already suffered from ransomware attacks and other cyber crimes. The identity of victims is not always made public, but attacks against 9 public school systems, 4 hospitals, and even a town police department in Connecticut have recently been reported.

If you are breached, what are the odds your company will be sued and subjected to punitive damages? Currently the odds of being sued are not high, but there is reason to believe the number of punitive damage suits will grow. Companies in the health care or financial services industries that handle Personally Identifiable Information (PII), and other companies that process credit card transactions, should be vigilant.

What Protection Does Connecticut Public Act No. 21-119 Provide?

The new law offers protection against punitive damages if a company conforms to an industry recognized cybersecurity framework. The statute states:

“In any cause of action founded in tort that is brought under the laws of this state or in the courts of this state and that alleges that the failure to implement reasonable cybersecurity controls resulted in a data breach concerning personal information or restricted information, the Superior Court shall not assess punitive damages against a covered entity if such entity created, maintained and complied with a written cybersecurity program that contains administrative, technical and physical safeguards for the protection of personal or restricted information and that conforms to an industry recognized cybersecurity framework.”

Can You Provide an Example of an Acceptable Cybersecurity Framework?

The legislation references 6 different cybersecurity frameworks it accepts. Any of these frameworks can be used to meet the statute’s requirements. To receive the law’s protection, organizations must keep current with framework releases and must not exhibit gross negligence or willful or wanton conduct.

The first approved framework listed is “The Framework for Improving Critical Infrastructure Cybersecurity” published by the National Institute of Standards and Technology. We will next describe the requirements of this framework to indicate the amount of effort needed to be compliant.

What Must Be Done to Conform to the NIST Framework?

The NIST Framework for Improving Critical Infrastructure Cybersecurity defines 5 Functions, 23 Categories, and 108 Sub-Categories. Through an ongoing process of self-assessment, organizations use the framework to gauge their level of cybersecurity preparedness. Users rate their cybersecurity maturity by assigning a Framework Implementation Tier ranging from 1 to 4. Framework Profiles are then prepared describing the Current Profile and Target Profile for each Sub-Category, establishing goals for future improvement.
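As a rough illustration of this self-assessment record keeping (the tier values below are invented examples, not a real assessment), current and target tiers can be tracked per Sub-Category and the largest gaps prioritized:

```python
# Illustrative NIST CSF self-assessment record: a Current and Target
# Implementation Tier (1-4) per Sub-Category. All tier values are invented.

profiles = {
    "ID.GV-3": {"current": 3, "target": 4},  # legal/regulatory requirements managed
    "PR.IP-4": {"current": 3, "target": 4},  # backups conducted, maintained, tested
    "DE.AE-5": {"current": 1, "target": 3},  # incident alert thresholds identified
    "RS.MI-1": {"current": 2, "target": 3},  # incidents are contained
    "RC.CO-3": {"current": 2, "target": 3},  # recovery activities communicated
}

def biggest_gaps(profiles):
    """Sort Sub-Categories by the distance from Current to Target Profile."""
    return sorted(profiles,
                  key=lambda sc: profiles[sc]["target"] - profiles[sc]["current"],
                  reverse=True)

print(biggest_gaps(profiles))  # DE.AE-5 leads: the largest gap (Tier 1 -> 3)
```

Ranking the gaps this way gives the improvement roadmap the Profiles are meant to drive.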

The following table provides examples of 5 different Sub-Categories (one from each Function).

Function | Category | Sub-Category ID | Sub-Category Description
Identify | Governance | ID.GV-3 | Legal and regulatory requirements regarding cybersecurity, including privacy and civil liberties obligations, are understood and managed
Protect | Information Protection Processes and Procedures | PR.IP-4 | Backups of information are conducted, maintained, and tested
Detect | Anomalous Events | DE.AE-5 | Incident alert thresholds are identified
Respond | Mitigation | RS.MI-1 | Incidents are contained
Recover | Communications | RC.CO-3 | Recovery activities are communicated to internal and external stakeholders as well as executive and management teams


Is There a Cost Associated?

There is no cost to access and use the NIST Framework for Improving Critical Infrastructure Cybersecurity. Though consultants can help your organization implement the framework, NIST neither requires nor certifies consultants for this purpose.

What Should We Do Next?

We believe all Connecticut companies and non-profit organizations should consider their cyber security risk exposure. Connecticut Public Act No. 21-119 provides an impetus to assess your current vulnerabilities and to take steps to make your organization’s information assets more secure.

If you have not done so already, assign a member of your company’s management team to learn about the law and the available frameworks. Select an approved framework and use that tool to help drive your organization to continuous improvement. In addition to receiving the new law’s protections you will reduce the chances of a costly cyber breach.

If you have any questions, CAPS will be happy to share our advice at no charge. Though this law is new we have been helping Connecticut companies protect their valuable data resources for over 25 years.

Public Cloud services offer flexibility and scalability. However, Public Cloud security challenges are real. In most cases colocation offers a more secure environment.

A recent survey by Proofpoint of over 600 organizations revealed the following troubling responses:

  • 72% of the respondents said moving to the Cloud coupled with more mobile workforces has introduced new security and compliance risks to their organization
  • 75% of the respondents said the increased use of Cloud applications and services without the explicit approval of the IT department (Shadow IT) presents a significant security risk
  • 78% of the companies surveyed say employees have accidentally exposed sensitive data stored in the Public Cloud

Lightspin, an Israeli Cloud penetration testing company, recently conducted an analysis of AWS and found 46% of the 40,000 S3 buckets they reviewed appeared to be misconfigured and could be unsafe.

Let’s look at several recent major security breaches experienced by users of both Amazon Web Services and Microsoft Azure.

Recent AWS Breaches

A company that provides channel management software to the online travel industry had its AWS data breached. In November of 2020 a failure to appropriately configure AWS Simple Storage Service (S3) buckets left over 24 GB of data with over 10 million files exposed. Data going back 10 years including credit card information, email addresses, and phone numbers was compromised.

In March of 2021, over 50,000 patient records containing Protected Health Information (PHI), including medical insurance identification, driver's licenses, and passport information, were leaked in Utah. The company that was attacked provided COVID testing services and had incorrectly set up the AWS S3 buckets used to store test results.

This past summer dozens of municipal governments had over 1 TB of data across 1.6 million files exposed. A Geographic Information System (GIS) used by these local governments was built upon AWS. Once again S3 buckets were misconfigured.

Recent Azure Breaches

It is not just AWS that suffers from security breaches. Last December an application developer left an Azure Blob Storage account with 587,000 files open to the public. Medical records, insurance documents, and other PHI were exposed.

In August, Microsoft warned thousands of Azure Cosmos DB users that their data may have been exposed. The flaw, which was remedied via a subsequent patch, could potentially allow a user to gain access to another customer’s resources. Security investigators were able to gain complete unrestricted access to the accounts and databases of several thousand Azure customers including some Fortune 500 accounts.

Enabling Public Cloud Breaches

Though criminals and disaffected employees are ultimately responsible for breaches of the Public Cloud, certain practices enable these attacks. Misconfigurations lead to Public Cloud breaches by exposing an organization’s data. Public Cloud misconfigurations are so common because:

  1. There are many configuration options for the Public Cloud
  2. Ongoing software changes by Public Cloud providers can go unnoticed by users
  3. Lack of Public Cloud configuration expertise especially for Shadow IT projects
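To illustrate how simple the exposure behind the S3 incidents above can be, and how scannable it is, here is a simplified sketch of a public-access check. The ACL structure is modeled loosely on the shape of S3 access-control lists but is pared down, and the bucket data is invented:

```python
# Simplified misconfiguration check: flag storage buckets whose access-control
# lists grant rights to "everyone". The ACL layout is loosely modeled on S3's
# and the bucket data is invented for illustration.

PUBLIC_GRANTEE = "http://acs.amazonaws.com/groups/global/AllUsers"

def publicly_exposed(buckets):
    """Return names of buckets with any grant to the all-users group."""
    exposed = []
    for name, acl in buckets.items():
        for grant in acl.get("Grants", []):
            if grant.get("Grantee", {}).get("URI") == PUBLIC_GRANTEE:
                exposed.append(name)
                break  # one public grant is enough to flag the bucket
    return exposed

buckets = {
    "internal-reports": {"Grants": [{"Grantee": {"ID": "owner"}, "Permission": "FULL_CONTROL"}]},
    "test-results":     {"Grants": [{"Grantee": {"URI": PUBLIC_GRANTEE}, "Permission": "READ"}]},
}
print(publicly_exposed(buckets))  # ['test-results']
```

Real configuration scanners walk every bucket, policy, and permission this way; a single overlooked grant is all it takes to expose millions of files.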

Though misconfigurations are the most frequent breach enabler, other weaknesses can also expose Public Cloud applications to attack:

  1. Poor password practices
  2. Inadequate access restrictions
  3. Mismanaged permission controls
  4. Disabled or missing data encryption
  5. Insufficient API security
  6. Neglected workloads

Shared Responsibility Model Can Cause Problems

Public Cloud service providers employ a Shared Responsibility Model to delineate responsibility for various components of security. Public Cloud providers take responsibility for securing their Cloud infrastructure. Clients are responsible for the security of their applications and data. Client confusion about their specific security responsibilities can create security gaps that can be exploited.

Why Colocation is More Secure

Colocation is inherently more secure than the Public Cloud.

  • With colocation each client can deploy and configure its own firewalls to provide the most protection for its organization. Public Cloud clients must work within the virtual firewall constructs their provider makes available.
  • With colocation the risks due to complex and changing configuration options are all but eliminated. Colocation clients own their systems and have much more control over any changes.
  • With colocation it is much easier to prevent exposure from Shadow IT, since unsanctioned projects are not run on the collocated systems.
  • Colocation service providers such as CAPS provide ongoing technical support and monitoring to make sure systems are operating properly and to detect inappropriate activities.


The dog days of summer are upon us. This season reminds us of the critical importance of data center air conditioning systems. Though intense summer heat puts server rack cooling systems to the test, protecting against rising cabinet temperatures is no longer a problem restricted to the summer. As the power densities in server racks increase, the need to carefully monitor and control temperature and humidity has become an even more important year-round concern.

Until recently most data cabinets drew 5 kW or less. Now, as cabinets are packed with more powerful processors and more compact servers, the electrical load per cabinet is increasing. It is not unusual for the power per cabinet to be twice what it was only a few years ago, and the near-term trend is for consumption per cabinet to grow severalfold.

More Power In Equals More Heat Out

More electrical power input produces increased heat output. Since elevated temperatures can damage critical IT systems and cause outages, it is more important than ever to manage the temperature in data centers. Relative Humidity levels can also impact service so they too must be regulated.

Colocation data centers like CAPS are experts at managing heat and humidity while powering clients’ IT systems economically. Computer Room Air Conditioning (CRAC) systems direct cooled air to the front of servers and remove heated air from the back of these systems. Cabinets are set up for Hot Aisle and Cold Aisle Containment to maximize efficiency. CRAC systems also control humidity in the data center.

Electricity is needed not only to run IT systems but also to power the environmental systems that control temperature and humidity. The higher electrical demand of today’s high performance server cabinets in turn requires more power for the air conditioning systems that remove the additional heat.

What Should Temperature and Humidity Be?

The American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE) recommends the following ranges for temperature and humidity in data centers:

Temperature                64° F to 81° F

Relative Humidity         20% to 80%

Sustained elevated temperatures can lead to component failures and outages. Low Relative Humidity can increase the likelihood of Electrostatic Discharges which can damage equipment. High Relative Humidity can lead to hygroscopic dust contamination that can cause current leakage or shorts. High humidity also leads to condensation that can cause corrosion and equipment failures.

CAPS drives itself to the highest performance standards to protect the critical IT systems placed in our data center by colocation clients. CAPS continuously monitors the temperature and humidity in its data center and adjusts its systems as required to maintain temperatures between 69° F and 72° F and Relative Humidity between 40% and 50%.
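Continuous monitoring of this kind amounts to checking each sensor reading against nested thresholds. The sketch below is illustrative only; the two-level warning/alarm logic and the sensor readings are our assumptions, not a description of CAPS’s actual monitoring system, though the ranges are those quoted above:

```python
# Sketch of an environmental threshold check using the ranges quoted above.
# The warning/alarm scheme and the readings are invented for illustration.

ASHRAE_TEMP_F = (64, 81)   # recommended temperature range
ASHRAE_RH_PCT = (20, 80)   # recommended relative humidity range
TARGET_TEMP_F = (69, 72)   # tighter operating target
TARGET_RH_PCT = (40, 50)

def check(reading, target, recommended):
    """Classify a reading: 'ok', 'warning' (outside target), 'alarm' (outside recommended)."""
    low, high = recommended
    if not (low <= reading <= high):
        return "alarm"
    low, high = target
    if not (low <= reading <= high):
        return "warning"
    return "ok"

print(check(70, TARGET_TEMP_F, ASHRAE_TEMP_F))  # 'ok'
print(check(76, TARGET_TEMP_F, ASHRAE_TEMP_F))  # 'warning': inside ASHRAE range, above target
print(check(15, TARGET_RH_PCT, ASHRAE_RH_PCT))  # 'alarm': low humidity raises ESD risk
```

Operating to the tighter target band leaves headroom: a reading can drift into the warning zone and be corrected before it ever threatens equipment.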

Colocation Provides the Best Environment

As servers become more powerful and generate more heat, the case is even more compelling for colocation. Many corporate data centers do not have the sophisticated environmental systems and knowledge required to protect today’s high performance servers. Colocation data center professionals have the expertise and tools needed to cost-effectively regulate temperature and humidity to protect clients’ valuable investment in Information Technology.


Connecticut suffers from an acute shortage of IT support personnel. It is hard to find and retain skilled engineers anywhere in the country. It is especially difficult here due to our high cost of living. Though many companies outsource IT support to Managed Service Providers, MSPs also struggle to hire capable and affordable technical talent. They too can use some help.

CAPS employs skilled engineers to support its colocation, data backup and restoral, disaster recovery, and business continuity clients. These engineers are experts in electrical power, air conditioning, cabling, networking, software licensing, backup and recovery, business continuity, and data security.

CAPS’ clients have found that the ongoing free support from CAPS’ professionals helps them get more done with less. Here are some recent examples of free IT support provided by CAPS’ engineers.

Electrical Power Consulting

A media company client was growing dramatically and needed to expand its colocation footprint. Its applications are power intensive, so cabinet designs must be planned carefully. The engineering team at CAPS, led by a licensed electrician, helped the client select the best cabinet layout and power distribution plan to support its immediate and future requirements.

Data Backup and Recovery Consulting

A consumer products company repatriated a large database from the Public Cloud and purchased new servers which it collocated at CAPS. While meeting with a CAPS engineer to plan for additional colocation space, the company’s data backup challenges were discussed. The engineer explained how it would be possible to restore data faster and at a much more granular level using Veeam Backup and Replication software. When the client decided to migrate to the Veeam solution, the CAPS engineering team provided training and assistance enabling a successful cutover in under a week.

Network Monitoring

A telecommunications service provider was experiencing high packet losses on some of its internet circuits. Network engineers at CAPS monitored the circuits and worked with the client’s engineers to identify a configuration problem that was quickly resolved. While working with the client to isolate the problem the CAPS engineers temporarily added bandwidth in increments to establish that the problem was not primarily due to transmission capacity.

Remote Hands

A bank needed to reseat a drive on a server. Rather than dispatch one of their own personnel they asked the CAPS engineers for help. This saved at least an hour plus the cost of driving to and from the data center. CAPS includes four hours of free remote hands services per month for its colocation clients.

Technical support is frequently a secondary consideration when selecting a colocation provider. However, many of CAPS’ clients have found that the ongoing expert help included in a colocation engagement is a significant benefit, given the scarcity of available engineering talent.

Private Cloud Data Backup

Artificial Intelligence and other technologies have made it possible for a company in a mature industry (farming) to generate rapid growth. Applying advanced technology has improved product quality, saved money, and reduced time to market. To support its expansion, the company partnered with Blue Hill/CAPS to implement a secure Private Cloud data backup solution that increases resiliency, fulfills compliance requirements, and saves money.


Growing Company Has Greater Compliance Requirements

The company’s CTO and database manager recognized that the existing data backup system lacked the functionality and security to support future growth. The company had been using Windows Server software to back up databases each week to network attached storage. There was no offsite replication, and restores were limited to an entire server at a time.


Senior management at the company is committed to holding itself to the highest standards. It is investing in its technology infrastructure and procedures to comply with standards such as GDPR, Sarbanes-Oxley, and SSAE-18 (SOC 1, SOC 2). It believed its existing data backup system was not robust enough to support its planned compliance requirements.


Veeam Verified Backups and Replications

The Blue Hill/CAPS team recommended its 100% Private Cloud Data Backup and Replication solution. Using powerful Veeam Availability Suite software, data is copied from the company’s production servers collocated at CAPS’ secure data center to a backup storage appliance at the CAPS Private Cloud data center.


The backups are replicated to Blue Hill’s secure Private Cloud storage facility located about 80 miles from CAPS’ data center. Blue Hill/CAPS is one of the few IT service providers in the Northeast offering multiple secure Private Cloud data centers located a safe distance apart yet close enough to minimize latency and maximize service.


The Veeam Availability Suite software used by Blue Hill/CAPS automates backups and replications and reports the success of each task in real time to both the customer and the Blue Hill/CAPS team. Replication between the two data centers runs over a high-bandwidth circuit, and the replicas are protected with AES-256 encryption both at rest and in transit.


Better Customer Service and Cost Savings

The transition to the new backup system took only a few days, including the time required to train client personnel. Multiple servers with several terabytes of data were backed up locally and replicated to the remote data center. The power of the new system was immediately apparent to the Database Manager, who was thrilled with its ability to perform restorations at a granular level.


“Now we can restore individual schemas instead of being forced to restore an entire database. This will make it possible for us to resolve problems faster and provide much better service to our users.”


The CTO is excited about the money he believes will be saved by the new 100% Private Cloud data backup solution.


“We have already saved hundreds of thousands of dollars by repatriating our large databases from the Public Cloud. That is what motivated us to move our database servers to Blue Hill/CAPS a few years ago when we decided to use colocation to save the money we had been spending on egress fees. We expect to add to our savings with the Blue Hill/CAPS Private Cloud backup solution.”

Offsite Data Backup is Required

The need to back up data to an offsite location is undeniable. Ransomware and other cyber attacks are increasing dramatically, and creating timely backups that can be restored quickly is one of the few ways to minimize the potentially devastating effects of a data breach. Backups must be made frequently, and they should be validated as error free so a restore will succeed when it is required.
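One basic form of the validation described above is checksum verification: record a digest when the backup is written, then recompute and compare it before restoring. The sketch below, using only Python's standard library, is a minimal illustration with hypothetical file names; real backup products go much further (for example, booting backups in an isolated sandbox to prove they restore):

```python
import hashlib

def sha256_of(path, chunk_size=1 << 20):
    """Stream a file and return its SHA-256 hex digest."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def record_checksum(backup_path):
    """Store the digest next to the backup at backup time."""
    with open(backup_path + ".sha256", "w") as f:
        f.write(sha256_of(backup_path))

def verify_before_restore(backup_path):
    """Recompute and compare the digest before attempting a restore."""
    with open(backup_path + ".sha256") as f:
        expected = f.read().strip()
    return sha256_of(backup_path) == expected

# Hypothetical example: write a backup, record its digest, verify it.
with open("db.bak", "wb") as f:
    f.write(b"backup bytes")
record_checksum("db.bak")
print(verify_before_restore("db.bak"))  # True while the file is intact
```

If the backup is silently corrupted after the digest is recorded, the verification fails and the bad copy is caught before a restore is attempted against it.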

Public Cloud Data Backup

AWS, Microsoft Azure, and Google Cloud all offer a variety of Public Cloud data backup solutions, and many other cloud service providers market data backup services as well. In general, cloud data backup services can be provisioned quickly and at low initial expense. You can easily change the amount of storage you use, configure secondary and tertiary backups with a few keystrokes, and even select the data center region where your backups are stored. In addition, long-term contracts are not required. For these reasons, Public Cloud Data Backup is growing in popularity.

So What’s Wrong With Public Cloud Data Backup?

  1. Security is Suspect

There are three concerns with backing up data to the Public Cloud. The first and most important is security. Public Cloud service providers operate under a “Shared Responsibility Model”: the Cloud provider is responsible for the security of its infrastructure but not for the workloads that run on its resources. This division can lead to confusion and, ultimately, to security gaps that can be exploited. Misconfigured cloud data storage is frequently how criminals compromise data backups. For example, a recent survey of over 600 U.S. companies found that 78% had accidentally exposed sensitive data stored in the cloud in the past year.

Public Cloud, because it is growing so quickly, has become a primary target for cyber crooks. The number of breaches of Public Cloud data backups has increased dramatically in the past few years.

  2. Technical Support

Public Cloud services do not include ongoing technical support. Online documentation and training videos are available, but hands-on support must come from an organization’s own team or from an outside provider such as an IT consultant or Managed Service Provider. Given the rapid changes in the market and the complexity of the offerings, knowledgeable technical support is essential for Public Cloud data backups. Many companies have no in-house experts and are unsure whether they are getting the best advice from outside consultants.

  3. Unexpected Costs

Public Cloud egress fees are charged whenever data is moved out of the cloud, which is exactly what happens every time data is restored or an organization tests its ability to restore a backup. Depending on the amount of data transferred, the costs can be significant. For many, the month-to-month variability of data transfer fees makes Public Cloud Data Backup less attractive: monthly expenses are hard to budget, and fees can grow dramatically.
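To see how these fees add up, here is a back-of-the-envelope sketch. The $0.09-per-GB rate is an assumption for illustration only; actual Public Cloud egress tariffs are tiered and vary by provider and region:

```python
def egress_cost(gb_transferred, rate_per_gb=0.09):
    """Flat per-GB cost of moving data out of the cloud.
    The $0.09/GB rate is illustrative; real tariffs are tiered."""
    return gb_transferred * rate_per_gb

# A single full restore test of a hypothetical 20 TB backup set:
restore_gb = 20 * 1000                     # 20 TB expressed in GB
print(f"${egress_cost(restore_gb):,.2f}")  # $1,800.00 for one test restore
```

An organization that tests restores regularly, as it should, pays that charge every time, which is why variable egress fees are hard to budget against a fixed-price alternative.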

CAPS’ Local Data Center Alternative

Offsite data backup to a local data center offers several advantages over the Public Cloud alternative. First, security can be stronger. The security professionals at a local data center such as CAPS can configure customized security solutions and continuously monitor operations. Using advanced tools such as Veeam SureBackup, the CAPS team can proactively test backups to ensure they have not been compromised.

Data centers like CAPS provide ongoing technical support to address each client’s unique needs. Even clients who get support from an IT consultant or MSP can benefit from a second pair of eyes to help solve problems and optimize solutions. Clients who back up their data at CAPS also receive ongoing Remote Hands support as part of their service.

Data backup to the CAPS data center is done for a fixed monthly charge. There are no egress fees or changes in the negotiated monthly rate. This makes planning possible and keeps costs within budget.

For those companies in Connecticut and the Northeast, CAPS provides optimal locations for offsite data backup. Superior data security, personalized services, and fixed monthly charges make the CAPS’ solution the best choice for offsite data backup.