
3 Trends in Data Center Cooling

Data center power consumption is evolving all the time, becoming more efficient but, generally, growing.  While many data centers are pursuing green initiatives and finding ways to make their energy usage as efficient as possible, data demands keep growing, rack density keeps increasing, and the need for effective cooling grows along with them.  There are many approaches to data center cooling, and even a single data center may implement a variety of approaches to cool its facility.  Large data centers from companies like Yahoo or Apple are setting the trend in green cooling initiatives, and smaller data centers are not only taking note but implementing those trends in their own facilities.  Below are three notable trends in data center cooling.

  1. Liquid Cooling
    • Using liquid instead of air to cool is a great way to handle higher-density racks, and it can be deployed in a variety of ways in the data center. TechTarget elaborates on the use of liquid cooling in data centers: “Now, new technologies can put 250 kW in a single rack, using liquid immersion cooling to play an important role for certain systems, such as high-performance computing, Cecci said. The pluses of liquid cooling include the ability to deploy it in specific areas — by row and rack — and it is very quiet and reliable, with few moving parts. Despite its benefits, liquid cooling is not in many data centers today, he said. ‘Most of these technologies — we will see them in the next two to three years,’ Cecci said.”
  2. CRAC
    • CRAC (computer room air conditioner) cooling systems have been used in data centers for a considerable amount of time. While they may be the old standard, CRAC systems have continued to evolve, with new strategies keeping them an effective form of cooling in data centers (see the control sketch after this list), which TechTarget explains: “The easiest way to save money is to reduce the number of running CRAC units. If half the amount of cooling is required, turning off half the CRAC units will give a direct saving in energy costs — and in maintenance costs. Using variable-speed instead of fixed-speed CRAC units is another way to accomplish this, where the units run only at the speed required to maintain the desired temperature. The units run at their most effective levels only when they run at 100%, and some variable speed systems don’t run at a fully optimized rate when operating at partial load. Running standard, fixed-rate CRAC units in such a way as to build up ‘thermal inertia’ can be cost-effective. Here, the data center is cooled considerably below the target temperature by running the units, and then they are turned off. The data center then is allowed to warm up until it reaches a defined point, and the CRAC units are turned back on.”
  3. Bypass Air
    • Bypass air is any conditioned air that does not pass through IT equipment before returning to the cooling unit. In essence, it is wasted cooled air, which has led many data centers to make efforts to reduce the problem.  Data Center Dynamics explains the problem and how data centers are fixing it to improve cooling: “The velocity of the cool air stream exceeds the ability of the server fans to draw in the cool air; as a result the cool air shoots beyond the face of the IT rack.  Cool supply air can join the return air stream before passing through servers, weakening cooling efficiency. Eager to combat the inefficiencies above and keep pace with steadily climbing data center temperatures, businesses often adopt hot aisle/cold aisle rack orientation arrangements, in which only hot air exhausts and cool air intakes face each other in a given row of server racks. Such configurations generate convection currents that produce improved airflow. Although superior to chaos air distribution, hot aisle/cold aisle strategies have proven only marginally more capable of cooling today’s increasingly dense data centers, largely because both approaches ultimately share a common, fatal flaw: They allow air to move freely throughout the data center. This flaw eventually led to the introduction of containment cooling strategies. Designed to organize and control air streams, containment solutions enclose server racks in sealed structures that capture hot exhaust air, vent it to the CRAC units and then deliver chilled air directly to the server equipment’s air intakes.”
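
The “thermal inertia” strategy described in item 2 is essentially hysteresis (bang-bang) control: cool the room well below the target, switch the CRAC units off, and switch them back on once the temperature drifts up to a defined limit. Below is a minimal Python sketch of that idea; the specific temperature thresholds are illustrative assumptions, not figures from the article.

```python
# Hysteresis control sketch for the "thermal inertia" CRAC strategy:
# overcool, switch off, let the room warm to a defined limit, repeat.
# The thresholds below are hypothetical example values.

LOW_SETPOINT_C = 18.0   # cool down to this temperature, then switch the units off
HIGH_LIMIT_C = 27.0     # switch the CRAC units back on at this temperature


def crac_should_run(current_temp_c: float, currently_running: bool) -> bool:
    """Decide whether the CRAC bank should run, given the room temperature."""
    if currently_running:
        # Keep running until the room is well below the target.
        return current_temp_c > LOW_SETPOINT_C
    # Stay off until the room warms up to the defined limit.
    return current_temp_c >= HIGH_LIMIT_C


if __name__ == "__main__":
    running = True
    for temp in (24.0, 20.0, 17.5, 22.0, 26.0, 27.3, 23.0):
        running = crac_should_run(temp, running)
        print(f"{temp:5.1f} C -> CRAC {'ON' if running else 'OFF'}")
```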

Importance of Renewable Energy in Data Centers


“Renewable energy.”  “Clean energy.”  These may sound like buzzwords, trendy little catchphrases meant to grab your attention and sound good, but they are far more than that; they are the reality and the future of data centers.  As every industry pushes to become sustainable, one of the biggest focuses will likely be sustainable, renewable energy in the data center.  Large data centers use enormous amounts of energy, equivalent to what some small cities use, so it only stands to reason that there would be a push to make that energy usage as clean as possible.  So, just how critical will it be for data centers to focus on sustainability through renewable and clean energy going forward? Very.  In fact, Data Center Knowledge notes that a recent study found that what consumers want, now and going forward, are data centers focused on sustainability: “A recent survey of consumers of retail colocation and wholesale data center services by Data Center Knowledge, found that 70 percent of these users consider sustainability issues when selecting data center providers. About one-third of the ones that do said it was very important that their data center providers power their facilities with renewable energy, and 15 percent said it was critical. Most respondents said their interest in data centers powered by renewable energy would increase over the next five years. More than 60 percent have an official sustainability policy, while 25 percent are considering developing one within the next 18 months.”

As data center space across the globe continues to grow rapidly, so will the amount of energy used.  That energy use is often not only bad for the environment but quite costly.  We have already seen large companies like Google and Apple focus on renewable energy and, as we often see, smaller data centers will likely follow in their footsteps.  Small and large data centers are undertaking renovations and making changes towards renewable energy because even the tiniest improvements in efficiency and sustainability save big money.  What do these renewable energy efforts look like?  There is a vast array of options and approaches, but Data Center Frontier elaborates on a few: “In broad terms, ‘clean’ or ‘green’ energy comes from renewable sources such as the sun (solar), wind, the movement of water in rivers and oceans (hydroelectricity), biofuels (fuel derived from organic matter), and geothermal activity. Today, there are big trends showing that tech giants are moving towards renewable energy sources in their green data centers. Digital Realty, along with certain major technology companies and other pioneers, are showing that clean energy can be used to power even the largest and most high-performance data centers. And as more organizations consider moving from traditional to cleaner sources of power, they are also showing that renewable energy can be cost-effective.”  This is not a fleeting trend.  Renewable energy is here to stay; it is the future of data centers, and every data center should be making efforts, small and large, towards renewable energy for the future.



Are Data Center Silos Interfering With Growth?


Every business can experience information and data silos on some level, particularly when various applications and systems must communicate with each other.  But these silos are particularly evident, costly, and problematic in data centers.  Data Center Knowledge offers one example of the type of silo that can occur in a data center: “Electrical and mechanical systems in data centers are a perfect example of legacy IoT, he says, operating in silos, isolated from the IT systems they support. That isolation is the decades-old legacy, used to this day as the only method of securing these critical systems from intrusion.”  Not only are there electrical and mechanical silos in the data center, but data and information silos as well.  As certain tools and information are used in various ways by assorted applications, data silos emerge and become increasingly problematic.  So often, when information silos occur, information and processes are duplicated, which takes up more space and thus uses more energy.  Silos in data centers are truly a drain on resources.

The problem of data center silos is further exacerbated by the fact that, often, one company may have multiple data centers in locations all over the world.  To avoid data center silos in both large and small data centers, there must be collaboration and open lines of communication.  Fortunately, many data centers are opting for convergence rather than expansion, finding ways to use existing space in a more effective and efficient way.  This alone will help reduce information silos.  Data Center Knowledge explains how convergence is being actively applied in data centers and elaborates on the advantages to be found when data centers opt for convergence: “As we saw with many Datalink enterprise customers who moved from the silo model, IT data centers first began to incorporate server virtualization technology to logically represent multiple servers on one or more consolidated, physical systems with smaller data center footprints. The data centers also began to incorporate their own dedicated network and shared storage to support them. Suddenly, one physical server could be used to serve up the needs of multiple applications, which it often did with glowing results. But, a ripple effect of virtual server growth often expanded storage and network needs significantly. Suddenly, cost savings in one area could be offset by growing expenses in another…

The benefits for IT can lead to:

  • Less moving parts (and less individual vendor touch points) to manage or troubleshoot
  • Greater resource utilization at a lower cost
  • Faster application provisioning (one enterprise customer went from their prior three weeks to just 15 minutes to provision new applications)
  • Faster IT response to business priority changes or changing market conditions
  • Easier scaling and greater elasticity of the infrastructure
  • Related integration and cross-training of previously siloed IT teams, themselves, in order to align IT further to the business
  • A shorter pathway to on-demand services or private cloud environments to meet the IT needs of internal business units

  • A lower cost to support the growing data and application needs of the business (another enterprise customer, which runs its own SaaS business, found itself able to offer better-quality services to current and new customers at a lower overall cost to itself).”


Should Enterprises Keep IT In-House or Outsource to Colocation?


Enterprise data centers may be a dying breed.  Today we are seeing more and more enterprises opt for colocation over in-house data centers because of the high cost and level of expertise needed to run an enterprise data center. Additionally, cloud service providers are mitigating the need for enterprise data centers.  Data Center Journal explains the basic appeal of colocation to enterprises: “Many businesses don’t have the time and money to invest in the equipment, technology, security and staff to run a full data center. For those businesses, colocation can help them optimize their department and free up resources, giving employees the time and bandwidth to focus on more strategic business tasks. Colocation facilities provide the space, cooling, power and security for your server, storage and networking equipment, while giving IT managers access to high bandwidth, low latency and always-on connections… Large colocation facilities also offer significant benefits of scale. By utilizing large power and mechanical systems, the facility can provide high uptimes and speed as well as the ability to efficiently appropriate additional resources and quickly grow alongside your company.”

Colocation is certainly not free; it comes at a cost (particularly at the beginning) and carries its own set of security risks.  And while enterprise data centers will never completely disappear, they are certainly fading, and rapidly.  Data Center Knowledge discusses the trend of moving away from enterprise data centers towards colocation and cloud services: “As Liz Cruz, associate director with the market research firm IHS and the panel’s moderator, pointed out, hardware and infrastructure equipment sales into data centers are declining, while revenue colocation providers are raking in is growing in double digits, which means more and more companies choose outsourcing over their own data centers. Still, when she asked people in the audience to raise their hands if their companies had at least two-thirds of their IT capacity in colocation data centers, only a handful did. It’s cloud providers who are driving a lot of the revenue growth for colo companies – a lot more than enterprises, although enterprise data center spending is slowly waning. ‘Cloud providers are now the largest tenant of multitenant data center facilities,’ Cruz said… For colocation providers, these hard-nosed enterprise users are not only a big growth opportunity; it’s a matter of longevity. The race to capture the hearts and minds of the enterprise is on, but they’re not only racing each other. They’re also racing Amazon, Microsoft, and a few others. Most colo providers have embraced public cloud as reality and have been using their ability to provide direct network access to cloud services from their facilities as a way to attract enterprises, pitching customers on the hybrid cloud, where a physical footprint the customer has full control of is supplemented with public cloud services, all under one roof in a colocation facility.”  If the trend continues, as we suspect it will, colocation data centers will continue to grow and work towards integration with cloud services to draw more and more businesses away from enterprise data centers.



Data in Motion vs. Data at Rest

The tech industry loves to use catchy phrases to describe various processes, innovations, and aspects of data centers.  Every now and then, we think it is important to home in on those phrases and explore what they mean and how they impact data center operations.  One of those phrases is “data in motion vs. data at rest.”  The terms are somewhat self-explanatory, but their nuances and impact on data centers are not.  Data in motion is data that is actively being used or transferred; it is data in transit.  Data at rest is data that is not being actively used but is stored in the data center. These two types of data present unique security challenges.  For example, data in motion may be traversing the internet, which presents different security challenges than data at rest that, while not actively in use, may contain sensitive customer information.

When it comes to securing data of any kind, there are a variety of ways to prevent security breaches and cyber-attacks.  Encryption is certainly a must, and data at rest must be secured on a number of levels because it may be stored in multiple places, including databases, storage networks, file servers, or virtually in the cloud.  Data in motion is exposed to cyber-attacks at a number of points while in transit.  DataMotion explains what risks may be encountered and describes some best practices when it comes to securing data: “Data is at rest when it is stored on a hard drive. In this relatively secure state, information is primarily protected by conventional perimeter-based defenses such as firewalls and anti-virus programs. However, these barriers are not impenetrable. Organizations need additional layers of defense to protect sensitive data from intruders in the event that the network is compromised. Encrypting hard drives is one of the best ways to ensure the security of data at rest. Other steps can also help, such as storing individual data elements in separate locations to decrease the likelihood of attackers gaining enough information to commit fraud or other crimes… Data is at its most vulnerable when it is in motion, and protecting information in this state requires specialized capabilities. Our expectation of immediacy dictates that a growing volume of sensitive data be transmitted digitally, forcing many organizations to replace couriers, faxes, and conventional mail service with faster options such as email. Looking ahead, it will also become increasingly important for the encryption service your organization uses to cover mobile email applications. The Radicati Group predicts that 80% of email users will access their accounts via mobile devices by 2018, but more than 35% of organizations currently using email encryption say their users currently lack the ability to send secure messages from their mobile email client.”  Securing data is a challenge that will never go away; data centers must look forward, anticipate potential future threats, and use multi-level encryption to ensure that data in motion and data at rest remain protected.
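
To make the data-at-rest side concrete, here is a minimal sketch using the Fernet recipe from the widely used Python cryptography package (symmetric AES encryption with a built-in integrity check). The key handling shown is purely illustrative; in practice the key would live in a hardware security module or a managed key vault, never alongside the data it protects.

```python
# Minimal sketch: symmetric encryption of data at rest using the Python
# "cryptography" package's Fernet recipe (AES-128-CBC plus HMAC-SHA256).
# Key storage here is illustrative only -- never keep the key with the data.
from cryptography.fernet import Fernet

key = Fernet.generate_key()          # in practice: fetch from an HSM or key vault
fernet = Fernet(key)

record = b"customer_id=1042;balance=1250.00"
ciphertext = fernet.encrypt(record)  # this is what actually gets written to disk

# Later, when the data is needed again:
plaintext = fernet.decrypt(ciphertext)
assert plaintext == record
```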



The Internet of Things and How It Impacts Data Centers


From time to time we see new “catchphrases” or terminology pop up in the tech world, and suddenly they are being used everywhere.  One of these phrases is “the internet of things.”  As our world becomes increasingly automated and digitized, many aspects of our day-to-day lives are now controlled by or take place on the internet.  Forbes explains exactly what “the internet of things” means: “Simply put, this is the concept of basically connecting any device with an on and off switch to the Internet (and/or to each other). This includes everything from cellphones, coffee makers, washing machines, headphones, lamps, wearable devices and almost anything else you can think of.  This also applies to components of machines, for example a jet engine of an airplane or the drill of an oil rig. As I mentioned, if it has an on and off switch then chances are it can be a part of the IoT.  The analyst firm Gartner says that by 2020 there will be over 26 billion connected devices… That’s a lot of connections (some even estimate this number to be much higher, over 100 billion).  The IoT is a giant network of connected ‘things’ (which also includes people).  The relationship will be between people-people, people-things, and things-things.”

The reality is, whether people like it or not, the internet of things is taking over and our world is being powered by the internet.  This impacts our day-to-day lives in different ways, but one thing data centers know is that it means more data.  Everything that becomes part of the internet of things involves data.  That coffee machine that is connected to the internet is going to require data communication and storage on some small level.  And, if a jet engine is connected, it is probably using a lot of data.  As more things join the internet of things, data center demands increase dramatically. Data centers must begin to prepare now because, as time marches on, the internet of things will only grow. Data Center Dynamics points out just how much this will impact data centers going forward: “The internet of things will force enterprise data center operators to completely rethink the way they manage capacity across all layers of the IT stack, according to a recent report by the market research firm Gartner… Where this becomes problematic for data centers is management of security, servers, storage and network, Joe Skorupa, VP and distinguished analyst at Gartner, said. ‘Data center managers will need to deploy more forward-looking capacity management in these areas to be able to proactively meet the business priorities associated with IoT,’ he said in a statement.  For data center networks, the internet of things will basically mean a lot more incoming traffic. WAN links in data centers today are designed for ‘moderate’ bandwidth requirements of human interaction with applications. Data from multitudes of sensors will require a lot more bandwidth than current capacity… Of course a lot more data will mean a lot more storage will have to be provisioned in data centers. In addition to pure capacity, companies will have to focus on being able to get and use data generated by the internet of things cost effectively. Because of the volume of data and the amount of network connections that carry it, there will be more need for distributed data center management and appropriate system management platforms.”  While end-users may take the convenience of the internet of things for granted, data centers do not have that luxury.  They must be vigilant in preparation and expansion to ensure they can accommodate the dramatically growing data needs that the internet of things presents.
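
To make the bandwidth point concrete, the quick back-of-the-envelope calculation below estimates the steady-state inbound traffic a fleet of sensors generates. Every number in it is an assumption chosen for illustration, not a figure from Gartner or the article.

```python
# Back-of-the-envelope estimate of the extra WAN traffic IoT sensors add.
# All inputs below are illustrative assumptions.
sensor_count = 50_000        # devices reporting into the data center
payload_bytes = 512          # average message size, including protocol overhead
reports_per_minute = 4       # each sensor reports every 15 seconds

bits_per_second = sensor_count * payload_bytes * 8 * reports_per_minute / 60
print(f"Steady-state ingest: {bits_per_second / 1e6:.1f} Mbit/s")
# -> roughly 13.7 Mbit/s of continuous inbound traffic before any bursts,
#    which is why WAN link capacity becomes a planning concern.
```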


The Cost of a Data Center Security Breach


If there is one thing a data center is concerned with, aside from maximizing uptime, it is security.  In today’s world we constantly hear news stories about security breaches exposing businesses and individuals to dangers such as identity theft, information loss, other theft, and more.  Security breaches are not just an embarrassing frustration; they are a costly one as well.  Large businesses can obviously suffer significant losses, but the losses experienced by small and medium-sized businesses are significant as well. Security Intelligence describes the growing risk of security breaches: “Every corner of the organization — from human resources to operations to marketing — is generating, acquiring, processing, storing and sharing more data every day. Cybersecurity threats have conditioned organizations to defend the full depth of this sensitive information and infrastructure from a global threat landscape… IBM and Ponemon Institute are pleased to release the ‘2015 Cost of Data Breach Study: Global Analysis.’ According to our research, the average total cost of a data breach for the participating companies increased 23 percent over the past two years to $3.79 million.”  This growing problem and increasing cost is a clear signal that data centers and businesses alike must pay careful attention to security measures to ensure that data is properly protected.

While upping your security protection will certainly involve an up-front investment, if you are protecting critical information such as health, financial, social security, or other highly sensitive records, the cost of a breach will be far more than the cost of protection.  For example, cloud security may be adequate for less critical information (i.e., social media), but stronger protection is better for more sensitive information. If you have a smaller business and data center, you may think the risk of a security breach is smaller, but statistics show that, in general, security breaches are a growing reality for many.  Data Center Knowledge points out the frequency of security breaches: “Roughly half of businesses in the U.S. (49 percent) and globally (52 percent) assume that their IT security will be breached sooner or later. This is a recognition of reality, as 77 percent of U.S. businesses and 82 percent globally have experienced between 1 and 5 separate data security incidents in the last year.”  Data Center Knowledge also notes that small and medium-sized businesses that experience a security breach typically incur a loss, on average, of $86,500.  And the cost of liability for data breaches is growing, emphasizing the importance of protecting customers’ and users’ private information, which Data Center Knowledge points out: “There’s legislation brewing that would make organizations far more accountable for breaches of personal information and require them to pay actual damages to individuals, something he thinks will reverse the trend toward cloud and colocation back to in-house.”
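
A rough expected-loss calculation helps put the cost of protection in context. The sketch below combines the $86,500 average SMB loss cited above with an assumed breach probability and incident count; those last two inputs are illustrative assumptions, not data from the study.

```python
# Rough expected-loss illustration using the figures quoted above.
# Only the $86,500 average SMB loss comes from the article; the breach
# probability and incident count are assumptions for the example.
avg_loss_per_breach = 86_500      # average SMB loss cited by Data Center Knowledge
annual_breach_probability = 0.77  # share of businesses reporting incidents in a year
incidents_if_breached = 2         # assumed number of incidents in a bad year

expected_annual_loss = avg_loss_per_breach * annual_breach_probability * incidents_if_breached
print(f"Expected annual breach loss: ${expected_annual_loss:,.0f}")
# -> roughly $133,000/year, a useful yardstick when pricing additional security controls
```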


Advantages of Data Center Consolidation


In today’s data center world, there is a lot of discussion about increasing rack density, utilizing the space you have without having to relocate, and more.  Working with the space you have to accommodate growing data center needs and increasing infrastructure demands may require some creative thinking, but consolidation can be extremely beneficial.  Data center square footage does not come cheap, and running large data centers or multiple data centers uses a lot of energy and manpower.  Because of this, many data center managers are looking more closely at ways they can consolidate within their data center, or within their network of data centers, to save on the cost of overhead and energy use.

First, it is important to look at organizations that have multiple data centers.  This can happen as a result of businesses acquiring other organizations that have existing data centers in place, or it can happen through gradual expansion of needs.  During growth, it can seem, or even actually be, less expensive to simply keep those additional data centers open but, in the long run, it will not be.  Separate data centers require separate energy usage, separate rent or mortgage, separate personnel, separate infrastructure and more.  Those things add up over time, and often what businesses find is that there are unnecessary redundancies that can be solved with consolidation.  The obvious concern with consolidation is downtime.  Downtime can lead to loss of critical data, loss of money, and general frustration.  Data Center Knowledge explains why consolidation is often the better choice, and what three areas to look at when beginning to consolidate: “In many cases, creating better efficiency and a more competitive data center revolves around consolidating data center resources. With that in mind, we look at three key areas that managers should look at when it comes to data center consolidation. This includes your hardware, software, and the users… There are so many new kinds of tools we can use to consolidate services, resources, and physical data center equipment. Solutions ranging from advanced software-defined technologies to new levels of virtualization help create a much more agile data center architecture… The software piece of the data center puzzle is absolutely critical. In this case, we’re talking about management and visibility. How well are you able to see all of your resources? What are you doing to optimize workload delivery? Because business is now directly tied to the capabilities of IT, it’s more important than ever to have proactive visibility into both the hardware and software layers of the modern data center. Having good management controls spanning virtual and physical components will allow you to control resources and optimize overall performance… Data center consolidation must never negatively impact the user experience. Quite the opposite; a good consolidation project should actually improve overall performance and how the user connects. New technologies allow you to dynamically control and load-balance where the user gets their resources and data. New WAN control mechanisms allow for the delivery of rich resources from a variety of points. For the end-user, the entire process is completely transparent. For the data center, you have fewer resource requirements by leveraging cloud, convergence, and other optimization tools.”  Every data center that has grown over time or has a network of data centers should carefully consider where consolidation can occur to save money, improve efficiency and improve overall quality of service.


Is WAN Optimization the Future of Data Centers?


Wide-area networks (WANs) may not have been a priority in the past, but more and more data center managers are looking closely at WAN optimization as the future of data centers.  WAN optimization involves a series of techniques such as data deduplication, traffic shaping, data caching, compression, network monitoring, and more in an effort to speed interconnectivity. TechTarget explains the importance of focusing on the WAN moving forward: “A data center interconnect has historically replicated data from a primary data center to a disaster recovery site or backup data center. However, virtualization and cloud computing are transforming the role of a data center interconnect, and wide area network (WAN) managers must adjust their approach to these increasingly critical WAN links… WAN managers need to understand the changing environment within data centers and prepare for an increased demand on the WAN links that interconnect multiple data centers… WAN optimization makes transfer protocols more efficient and reduces the volume of traffic through compression and deduplication.”

Every data center needs a WAN strategy that is unique and carefully configured to meet that data center’s specific needs.  As remote access and nationwide capability needs increase, connectivity and speed demands shift and become more and more important.  When WAN optimization is executed properly, bandwidth limitations are mitigated and access to applications is improved.  We have previously discussed the shift towards data center consolidation as a way to improve efficiency while lowering the costs of overhead and personnel, optimizing infrastructure, and securing physical assets; data center consolidation means consolidating IT infrastructure as well.  With so many data centers consolidating IT infrastructure, there are fewer small data centers, which means more distance between the end-user and the data center, and that can mean poor application performance from latency and network congestion.  With fewer but larger servers, traffic is increased and WAN optimization becomes all the more important.   Data center consolidation can move forward effectively through WAN optimization.  WAN optimization will only continue to grow in importance, as Data Center Knowledge notes: “This means that while the CIO is trying to exercise tighter control over the corporate wide-area network (WAN), users are expecting looser controls and the ability to access anything, anywhere, anytime with scant regard for security or the impact on network performance. Look into the usage logs of most corporations today and you will find hours spent on Facebook, Twitter and YouTube, for example. This usage is expensive. The study further concluded that social media networks could potentially be costing Britain up to $22.16 billion. The solution CIOs desire is a fully integrated single platform that delivers complete WAN optimization capabilities, the insight to allow management to keep its eye on exactly what traffic is traversing the network, and the flexibility to dynamically optimize it when and if required.”
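
One of the techniques mentioned above, deduplication, is easy to illustrate. The sketch below splits a payload into fixed-size chunks, hashes each chunk, and transmits only chunks the far end has not already seen; real WAN optimization appliances use variable-size chunking and persistent dictionaries, so treat this fixed 4 KB scheme as a simplified assumption.

```python
# Simplified WAN-optimization deduplication: hash fixed-size chunks and only
# ship chunks the remote side has not seen before; known chunks are replaced
# by a short reference (their 32-byte digest).
import hashlib
import os

CHUNK_SIZE = 4096
remote_cache = set()   # digests of chunks the far end already holds


def send_over_wan(payload: bytes) -> int:
    """Return the number of bytes actually transmitted after deduplication."""
    sent = 0
    for i in range(0, len(payload), CHUNK_SIZE):
        chunk = payload[i:i + CHUNK_SIZE]
        digest = hashlib.sha256(chunk).hexdigest()
        if digest not in remote_cache:
            remote_cache.add(digest)
            sent += len(chunk)   # new chunk: ship the data itself
        else:
            sent += 32           # known chunk: ship only a reference
    return sent


if __name__ == "__main__":
    backup = os.urandom(1_000_000)                        # e.g. a nightly backup
    print("first transfer: ", send_over_wan(backup), "bytes")
    print("repeat transfer:", send_over_wan(backup), "bytes")  # far smaller
```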


Data Centers Must Protect Against Arc-Flash

When you think about “protection” in a data center, you probably think about protecting critical data, protecting infrastructure, protecting uptime, and so on.  But it is also important to think about protecting data center workers.   Whether a data center is small or large, the large amount of electrical equipment means certain safety measures must be taken to ensure worker safety.  One hazard data centers must protect against is “arc-flash.”  Data center workers face a conundrum of sorts: to work on, or perform maintenance on, certain electrical components without risk of arc-flash, electrical power to those components must be turned off, yet maintaining uptime often means that various electrical components cannot be shut off.  DataInformed explains what arc-flash is, and why it is such a significant concern in data centers: “An important electrical risk in the data center is arc-flash incidents. Arc-flash incidents, which are caused by arcing from an electrical fault, potentially creating a blast similar to an explosion, happen between five and 10 times a day in U.S. industry and result in one death every single workday.  Although data center design, permitting and construction are in adherence to modern electrical safety requirements, data center workers must be trained and competent, and must maintain compliance with all OSHA requirements to keep electrical safety in the data center at its current high standard.”

Not only is protecting against arc-flash important for the peace of mind of both employer and employee, it also helps a data center remain OSHA compliant, which reduces liability and cuts down on costs.  The specifics of how a data center implements protection against arc-flash are complex and highly individualized.  Data centers that implement best safety practices, though, will ultimately improve both safety and uptime.  When designing infrastructure and preparing a data center, it is critical that an arc-flash analysis be completed before the data center is up and running at full capacity.   Data Center Knowledge elaborates on what is involved in an arc-flash analysis or study: “An arc flash study looks at all the electrical components, from the source at the power company, the whole way through to the plugs that you plug into your IT equipment,” Furmanski told us in an interview.  “They look at how all the circuit breakers are set up — it’s called a coordination study — and they look at the power going through.  They punch in all these formulas to figure out, will these breakers move fast enough if there’s an electrical short, or will they move too slowly and let the capability of an arc flash be created?”  If your data center has not recently had an arc-flash analysis, or you are not sure if it ever has, it is incredibly important to complete one as soon as possible to maximize worker safety and uptime.
