
Are Data Center Silos Interfering With Growth?


Every business can experience information and data silos on some level, particularly when various applications and systems must communicate with each other. But these silos are particularly evident, costly, and problematic in data centers. Data Center Knowledge offers one example of the type of silo that can occur in a data center: “Electrical and mechanical systems in data centers are a perfect example of legacy IoT, he says, operating in silos, isolated from the IT systems they support. That isolation is the decades-old legacy, used to this day as the only method of securing these critical systems from intrusion.” Not only are there electrical and mechanical silos in the data center, but data and information silos as well. As certain tools and information are used in different ways by assorted applications, data silos emerge and become increasingly problematic. When information silos occur, information and processes are often duplicated, which takes up more space and therefore uses more energy. Silos in data centers are truly a drain on resources.

The problem of data center silos is further exacerbated by the fact that one company often has multiple data centers in locations all over the world. To avoid data center silos in both large and small data centers, there must be collaboration and open lines of communication. Fortunately, many data centers are opting for convergence rather than expansion, finding ways to use existing space more effectively and efficiently. This alone will help reduce information silos. Data Center Knowledge explains how convergence is being actively applied in data centers and elaborates on the advantages to be found when data centers opt for convergence: “As we saw with many Datalink enterprise customers who moved from the silo model, IT data centers first began to incorporate server virtualization technology to logically represent multiple servers on one or more consolidated, physical systems with smaller data center footprints. The data centers also began to incorporate their own dedicated network and shared storage to support them. Suddenly, one physical server could be used to serve up the needs of multiple applications, which it often did with glowing results. But, a ripple effect of virtual server growth often expanded storage and network needs significantly. Suddenly, cost savings in one area could be offset by growing expenses in another…

The benefits for IT can lead to:

  • Less moving parts (and less individual vendor touch points) to manage or troubleshoot
  • Greater resource utilization at a lower cost
  • Faster application provisioning (one enterprise customer went from their prior three weeks to just 15 minutes to provision new applications)
  • Faster IT response to business priority changes or changing market conditions
  • Easier scaling and greater elasticity of the infrastructure
  • Related integration and cross-training of previously siloed IT teams, themselves, in order to align IT further to the business
  • A shorter pathway to on-demand services or private cloud environments to meet the IT needs of internal business units
  • A lower cost to support the growing data and application needs of the business (Another enterprise customer who runs its own SaaS business found itself able to offer better quality services to current and new customers at a lower overall cost to itself.)”
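To make the consolidation math concrete, here is a minimal Python sketch of the kind of back-of-the-envelope estimate behind the footprint and energy claims above. Every figure (server count, VMs per host, wattage) is a hypothetical placeholder, not data from the Datalink customers quoted.

def consolidation_estimate(physical_servers, vms_per_host, watts_per_server):
    """Estimate hosts needed and power saved if each existing physical
    server becomes one VM and each host runs vms_per_host VMs."""
    hosts_needed = -(-physical_servers // vms_per_host)  # ceiling division
    power_before = physical_servers * watts_per_server
    power_after = hosts_needed * watts_per_server
    return hosts_needed, power_before - power_after

hosts, saved_watts = consolidation_estimate(physical_servers=120,
                                            vms_per_host=15,
                                            watts_per_server=400)
print(f"120 siloed servers -> {hosts} virtualized hosts, "
      f"saving roughly {saved_watts / 1000:.1f} kW of IT load")

Even rough numbers like these show why convergence shrinks both footprint and energy draw, though the ripple effects in storage and network capacity noted in the quote still have to be budgeted for.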

Posted in Data Center Build, Data Center Design

Should Enterprises Keep IT In House or Outsource Colocation?


Enterprise data centers may be a dying breed. Today, more and more businesses are opting for colocation over enterprise data centers because of the high cost and level of expertise needed to run an enterprise facility. Additionally, cloud service providers are mitigating the need for enterprise data centers. Data Center Journal explains the basic appeal of colocation to enterprises: “Many businesses don’t have the time and money to invest in the equipment, technology, security and staff to run a full data center. For those businesses, colocation can help them optimize their department and free up resources, giving employees the time and bandwidth to focus on more strategic business tasks. Colocation facilities provide the space, cooling, power and security for your server, storage and networking equipment, while giving IT managers access to high bandwidth, low latency and always-on connections… Large colocation facilities also offer significant benefits of scale. By utilizing large power and mechanical systems, the facility can provide high uptimes and speed as well as the ability to efficiently appropriate additional resources and quickly grow alongside your company.”

Colocation is certainly not free; it comes at a cost (particularly at the beginning) and carries its own set of security risks. But while enterprise data centers will never completely disappear, they are certainly fading, and rapidly. Data Center Knowledge discusses the trend of moving away from enterprise data centers towards colocation and cloud services: “As Liz Cruz, associate director with the market research firm IHS and the panel’s moderator, pointed out, hardware and infrastructure equipment sales into data centers are declining, while revenue colocation providers are raking in is growing in double digits, which means more and more companies choose outsourcing over their own data centers. Still, when she asked people in the audience to raise their hands if their companies had at least two-thirds of their IT capacity in colocation data centers, only a handful did. It’s cloud providers who are driving a lot of the revenue growth for colo companies – a lot more than enterprises, although enterprise data center spending is slowly waning. “Cloud providers are now the largest tenant of multitenant data center facilities,” Cruz said… For colocation providers, these hard-nosed enterprise users are not only a big growth opportunity; it’s a matter of longevity. The race to capture the hearts and minds of the enterprise is on, but they’re not only racing each other. They’re also racing Amazon, Microsoft, and a few others. Most colo providers have embraced public cloud as reality and have been using their ability to provide direct network access to cloud services from their facilities as a way to attract enterprises, pitching customers on the hybrid cloud, where a physical footprint the customer has full control of is supplemented with public cloud services, all under one roof in a colocation facility.” If the trend continues, as we suspect it will, colocation data centers will continue to grow and work towards integration with cloud services to draw more and more businesses away from enterprise data centers.
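As a rough illustration of the in-house versus colocation trade-off, the sketch below amortizes a hypothetical build-out against a hypothetical colocation contract. Every figure is an assumption chosen for illustration only, not market data; a real comparison would also weigh migration risk, contract terms, and security requirements.

def annual_cost(capex, capex_years, annual_opex):
    """Amortize up-front capital over its useful life and add yearly operating cost."""
    return capex / capex_years + annual_opex

in_house = annual_cost(capex=2_000_000, capex_years=10,   # hypothetical build-out, UPS, cooling
                       annual_opex=450_000)               # hypothetical power, staff, maintenance
colocation = annual_cost(capex=50_000, capex_years=10,    # hypothetical migration, cabinets
                         annual_opex=300_000)             # hypothetical space, power, remote hands

print(f"In-house: ~${in_house:,.0f}/yr   Colocation: ~${colocation:,.0f}/yr")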


Posted in Cloud Computing, computer room maintenance, Data Center Build

Data in Motion vs. Data at Rest

The tech industry loves to use catchy phrases to describe various processes, innovations, and aspects of data centers. Every now and then, we think it is important to zero in on those phrases and explore what they mean and how they impact data center operations. One of those phrases is “data in motion vs. data at rest.” The terms are somewhat self-explanatory, but their nuances and impact on data centers are not. Data in motion is data that is actively being used or transmitted; it is data in transit. Data at rest is data that is not being actively used but is stored in a data center. These two types of data present unique security challenges. For example, data in motion may be in transit on the internet, which presents different security challenges than data at rest, which, while not actively in use, may contain sensitive customer information.

When it comes to securing data of any kind, there are a variety of ways to prevent security breaches and cyber-attacks. Encryption is certainly a must, and data at rest must be secured on a number of levels because it may be stored in multiple places, including databases, storage networks, file servers, or virtually in the cloud. Data in motion is exposed to cyber-attacks at a number of points while in transit. DataMotion explains what risks may be encountered and describes some best practices when it comes to securing data: “Data is at rest when it is stored on a hard drive. In this relatively secure state, information is primarily protected by conventional perimeter-based defenses such as firewalls and anti-virus programs. However, these barriers are not impenetrable. Organizations need additional layers of defense to protect sensitive data from intruders in the event that the network is compromised. Encrypting hard drives is one of the best ways to ensure the security of data at rest. Other steps can also help, such as storing individual data elements in separate locations to decrease the likelihood of attackers gaining enough information to commit fraud or other crimes… Data is at its most vulnerable when it is in motion, and protecting information in this state requires specialized capabilities. Our expectation of immediacy dictates that a growing volume of sensitive data be transmitted digitally — forcing many organizations to replace couriers, faxes, and conventional mail service with faster options such as email. Looking ahead, it will also become increasingly important for the encryption service your organization uses to cover mobile email applications. The Radicati Group predicts that 80% of email users will access their accounts via mobile devices by 2018, but more than 35% of organizations currently using email encryption say their users currently lack the ability to send secure messages from their mobile email client.” Securing data is a challenge that will never go away; data centers must look forward, anticipate potential future threats, and use multi-level encryption to ensure that data in motion and data at rest remain protected.
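As a minimal illustration of those two layers of defense, the Python sketch below encrypts a record before it is written to disk (data at rest) and opens a TLS-wrapped connection (data in motion). It assumes the third-party cryptography package is installed and that outbound access to example.com is available; key management and certificate policy are deliberately left out.

import socket
import ssl

from cryptography.fernet import Fernet

# Data at rest: encrypt the record before it ever touches the disk.
key = Fernet.generate_key()        # in practice, keep this key in a separate secrets store
fernet = Fernet(key)
with open("customer_record.bin", "wb") as f:
    f.write(fernet.encrypt(b"sensitive customer record"))

# Data in motion: wrap the socket in TLS so the bytes are encrypted in transit.
context = ssl.create_default_context()          # verifies the server certificate chain
with socket.create_connection(("example.com", 443)) as sock:
    with context.wrap_socket(sock, server_hostname="example.com") as tls:
        print("Encrypted channel negotiated:", tls.version())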


Posted in Data Center Infrastructure Management, Data Center Security

The Internet of Things and How It Impacts Data Centers


From time to time we see new “catchphrases” or terminology pop up in the tech world, and suddenly they are being used everywhere. One of these phrases is “the internet of things.” As our world becomes increasingly automated and digitized, many aspects of our day-to-day lives are now controlled by or take place on the internet. Forbes explains exactly what “the internet of things” means: “Simply put, this is the concept of basically connecting any device with an on and off switch to the Internet (and/or to each other). This includes everything from cellphones, coffee makers, washing machines, headphones, lamps, wearable devices and almost anything else you can think of. This also applies to components of machines, for example a jet engine of an airplane or the drill of an oil rig. As I mentioned, if it has an on and off switch then chances are it can be a part of the IoT. The analyst firm Gartner says that by 2020 there will be over 26 billion connected devices… That’s a lot of connections (some even estimate this number to be much higher, over 100 billion). The IoT is a giant network of connected “things” (which also includes people). The relationship will be between people-people, people-things, and things-things.”

The reality is, whether people like it or not, the internet of things is taking over, and our world is increasingly powered by the internet. This impacts our day-to-day lives in different ways, but one thing data centers know is that it means more data. Everything that becomes part of the internet of things involves data. The coffee machine connected to the internet will require data communication and storage on some small level, and if a jet engine is connected, it is probably using a lot of data. As more things become part of the internet of things, data center demands increase exponentially. Data centers must begin to prepare now, because as time marches on, the internet of things will only grow. Data Center Dynamics points out just how much this will impact data centers going forward: “The internet of things will force enterprise data center operators to completely rethink the way they manage capacity across all layers of the IT stack, according to a recent report by the market research firm Gartner… Where this becomes problematic for data centers is management of security, servers, storage and network, Joe Skorupa, VP and distinguished analyst at Gartner, said. “Data center managers will need to deploy more forward-looking capacity management in these areas to be able to proactively meet the business priorities associated with IoT,” he said in a statement. For data center networks, the internet of things will basically mean a lot more incoming traffic. WAN links in data centers today are designed for “moderate” bandwidth requirements of human interaction with applications. Data from multitudes of sensors will require a lot more bandwidth than current capacity… Of course a lot more data will mean a lot more storage will have to be provisioned in data centers. In addition to pure capacity, companies will have to focus on being able to get and use data generated by the internet of things cost effectively. Because of the volume of data and the amount of network connections that carry it, there will be more need for distributed data center management and appropriate system management platforms.” While end-users may take the convenience of the internet of things for granted, data centers do not have that luxury. They must be vigilant in preparation and expansion to ensure they can accommodate the dramatically growing data needs the internet of things presents.
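A quick back-of-the-envelope calculation shows why even modest per-device traffic adds up for WAN links and storage. The device count, message rate, and message size below are hypothetical assumptions for illustration, not figures from the Gartner report quoted above.

def iot_load(devices, messages_per_min, bytes_per_message):
    """Return approximate inbound WAN load (Mbps) and yearly storage (TB)."""
    bytes_per_sec = devices * messages_per_min / 60 * bytes_per_message
    mbps = bytes_per_sec * 8 / 1_000_000
    tb_per_year = bytes_per_sec * 3600 * 24 * 365 / 1e12
    return mbps, tb_per_year

mbps, tb = iot_load(devices=500_000, messages_per_min=2, bytes_per_message=512)
print(f"~{mbps:,.0f} Mbps of inbound sensor traffic, ~{tb:,.0f} TB stored per year")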

Posted in Internet of Things

The Cost of a Data Center Security Breach


If there is one thing a data center is concerned with, aside from maximizing uptime, it is security. In today’s world we constantly hear news stories about security breaches exposing businesses and individuals to dangers such as identity theft, information loss, and more. Security breaches are not just an embarrassing frustration; they are a costly one as well. Large businesses can obviously suffer significant losses, but the losses experienced by small and medium-sized businesses are significant as well. Security Intelligence describes the growing risk of security breaches: “Every corner of the organization — from human resources to operations to marketing — is generating, acquiring, processing, storing and sharing more data every day. Cybersecurity threats have conditioned organizations to defend the full depth of this sensitive information and infrastructure from a global threat landscape… IBM and Ponemon Institute are pleased to release the “2015 Cost of Data Breach Study: Global Analysis.” According to our research, the average total cost of a data breach for the participating companies increased 23 percent over the past two years to $3.79 million.” This growing problem and increasing cost is a clear signal that data centers and businesses alike must pay careful attention to security measures to ensure that data is properly protected.

While upping your security protection will certainly involve an up-front investment, if you are protecting critical information such as health, financial, social security, or other confidential records, the cost of a breach will be far more than the cost of protection. For example, cloud security may be adequate for less critical information (e.g., social media), but stronger protection is better for more sensitive information. If you have a smaller business and data center, you may think the risk of a security breach is lower, but statistics show that, in general, security breaches are a growing reality for many. Data Center Knowledge points out the frequency of security breaches: “Roughly half of businesses in the U.S. (49 percent) and globally (52 percent) assume that their IT security will be breached sooner or later. This is a recognition of reality, as 77 percent of U.S. businesses and 82 percent globally have experienced between 1 and 5 separate data security incidents in the last year.” Data Center Knowledge also notes that small and medium-sized businesses that experience a security breach typically incur a loss, on average, of $86,500. And the cost of liability for data breaches is growing, emphasizing the importance of protecting customers’ and users’ private information, as Data Center Knowledge points out: “There’s legislation brewing that would make organizations far more accountable for breaches of personal information and require them to pay actual damages to individuals, something he thinks will reverse the trend toward cloud and colocation back to in-house.”
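One way to weigh the up-front investment against the risk is a simple expected-loss calculation. The sketch below reuses the average cost figures quoted above; the breach probability is an illustrative assumption, not a figure from the cited studies.

def expected_annual_loss(breach_probability, cost_per_breach):
    """Expected yearly loss = chance of a breach in a year x average breach cost."""
    return breach_probability * cost_per_breach

smb_loss = expected_annual_loss(breach_probability=0.5, cost_per_breach=86_500)
enterprise_loss = expected_annual_loss(breach_probability=0.5, cost_per_breach=3_790_000)

print(f"SMB expected annual loss:        ~${smb_loss:,.0f}")
print(f"Enterprise expected annual loss: ~${enterprise_loss:,.0f}")

If the expected annual loss exceeds the yearly cost of stronger controls, the investment pays for itself on paper, before even counting reputational damage or the liability trends noted above.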

Posted in Data Center Security

Advantages of Data Center Consolidation


In today’s data center world, there is a lot of discussion about increasing rack density, utilizing the space you have without having to relocate, and more. Working with the space you have to accommodate growing data center needs and increasing infrastructure demands may require some creative thinking, but consolidation can be extremely beneficial. Data center square footage does not come cheap, and running large or multiple data centers uses a lot of energy and manpower. Because of this, many data center managers are looking more closely at ways they can consolidate within their data center, or within their network of data centers, to save on overhead and energy use.

First, it is important to look at organizations that have multiple data centers. This can happen as a result of businesses acquiring other organizations that have existing data centers in place, or it can happen through gradual expansion of needs. During growth, it can seem, or even actually be, less expensive to simply keep those additional data centers open, but in the long run it will not be. Separate data centers require separate energy usage, separate rent or mortgage payments, separate personnel, separate infrastructure, and more. Those things add up over time, and businesses often find unnecessary redundancies that can be improved and solved with consolidation. The obvious concern with consolidation is downtime, which can lead to loss of critical data, loss of money, and general frustration. Data Center Knowledge explains why consolidation is often the better choice, and which three areas to look at when beginning to consolidate: “In many cases, creating better efficiency and a more competitive data center revolves around consolidating data center resources. With that in mind, we look at three key areas that managers should look at when it comes to data center consolidation. This includes your hardware, software, and the users… There are so many new kinds of tools we can use to consolidate services, resources, and physical data center equipment. Solutions ranging from advanced software-defined technologies to new levels of virtualization help create a much more agile data center architecture… The software piece of the data center puzzle is absolutely critical. In this case, we’re talking about management and visibility. How well are you able to see all of your resources? What are you doing to optimize workload delivery? Because business is now directly tied to the capabilities of IT, it’s more important than ever to have proactive visibility into both the hardware and software layers of the modern data center. Having good management controls spanning virtual and physical components will allow you to control resources and optimize overall performance… Data center consolidation must never negatively impact the user experience. Quite the opposite; a good consolidation project should actually improve overall performance and how the user connects. New technologies allow you to dynamically control and load-balance where the user gets their resources and data. New WAN control mechanisms allow for the delivery of rich resources from a variety of points. For the end-user, the entire process is completely transparent. For the data center, you have fewer resource requirements by leveraging cloud, convergence, and other optimization tools.” Every data center that has grown over time or has a network of data centers should carefully consider where consolidation can occur to save money, improve efficiency, and improve overall quality of service.
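As a small illustration of the hardware piece, the sketch below mimics the kind of report a DCIM or monitoring tool might produce, flagging lightly used hosts as consolidation candidates. The server names, sites, utilization figures, and threshold are entirely hypothetical.

servers = [
    {"name": "db-01",  "site": "Site A", "cpu_util": 0.62},
    {"name": "app-07", "site": "Site A", "cpu_util": 0.08},
    {"name": "app-12", "site": "Site B", "cpu_util": 0.11},
    {"name": "web-03", "site": "Site B", "cpu_util": 0.55},
]

THRESHOLD = 0.20  # hosts averaging under 20% CPU are candidates to virtualize

candidates = [s for s in servers if s["cpu_util"] < THRESHOLD]
for s in candidates:
    print(f"{s['name']} ({s['site']}): {s['cpu_util']:.0%} average CPU "
          "-> candidate for consolidation onto a shared virtualized host")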

Posted in Data Center Design, Data Center Infrastructure Management

Is WAN Optimization the Future of Data Centers?


WANs (wide-area networks) may not have been prioritized in the past, but more and more data center managers are looking closely at them as the future of data centers. WAN optimization involves a series of techniques such as data deduplication, traffic shaping, data caching, compression, network monitoring, and more in an effort to speed interconnectivity. TechTarget explains the importance of focusing on the WAN moving forward: “A data center interconnect has historically replicated data from a primary data center to a disaster recovery site or backup data center. However, virtualization and cloud computing are transforming the role of a data center inter-connect, and wide area network (WAN) managers must adjust their approach to these increasingly critical WAN links… WAN managers need to understand the changing environment within data centers and prepare for an increased demand on the WAN links that interconnect multiple data centers… WAN optimization makes transfer protocols more efficient and reduces the volume of traffic through compression and deduplication.”

Every data center needs a WAN strategy that is unique and carefully configured to meet the data center’s specific needs. As remote access and nationwide capability needs increase, connectivity and speed demands shift and become more and more important. When WAN optimization is executed properly, bandwidth limitations are mitigated and access to applications is improved. We have previously discussed the shift towards data center consolidation as a way to improve efficiency, lower overhead and personnel costs, optimize infrastructure, and secure physical assets, but data center consolidation means consolidating IT infrastructure as well. With so many data centers consolidating IT infrastructure, there are fewer small data centers, which means a greater distance between the end-user and the data center, and that can mean poor application performance due to latency and network congestion. With fewer but larger servers, traffic increases and WAN optimization becomes all the more important. Data center consolidation can move forward effectively through WAN optimization. WAN optimization will only continue to grow in importance, as Data Center Knowledge notes: “This means that while the CIO is trying to exercise tighter control over the corporate wide-area network (WAN), users are expecting looser controls and the ability to access anything, anywhere, anytime with scant regard for security or the impact on network performance. Look into the usage logs of most corporations today and you will find hours spent on Facebook, Twitter and YouTube, for example. This usage is expensive. The study further concluded that social media networks could potentially be costing Britain up to $22.16 billion. The solution CIOs desire is a fully integrated single platform that delivers complete WAN optimization capabilities, the insight to allow management to keep its eye on exactly what traffic is traversing the network, and the flexibility to dynamically optimize it when and if required.”
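Two of the techniques most often bundled into WAN optimization, compression and deduplication, are easy to see in miniature. The toy Python sketch below compresses a repetitive payload with zlib and counts how few unique chunks would actually need to cross the WAN; real WAN optimizers work on live traffic streams with far more sophistication, so this only illustrates the principle.

import hashlib
import zlib

payload = b"GET /app/data?user=1234 HTTP/1.1\r\n" * 200   # repetitive application traffic

# Compression: shrink the bytes that actually traverse the WAN link.
compressed = zlib.compress(payload)
print(f"Compression: {len(payload)} bytes -> {len(compressed)} bytes")

# Deduplication: transmit each unique chunk once, then send short references.
chunk_size = 64
chunks = [payload[i:i + chunk_size] for i in range(0, len(payload), chunk_size)]
unique_chunks = {hashlib.sha256(c).hexdigest() for c in chunks}
print(f"Deduplication: {len(chunks)} chunks, only {len(unique_chunks)} unique to transmit")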

Posted in Data Center Design, data center equipment

Data Centers Must Protect Against Arc-Flash

When you think about “protection” in a data center, you probably think about protecting critical data, protecting infrastructure, protecting uptime, etc. But it is also important to think about protecting data center workers. Whether a data center is small or large, the large amount of electrical equipment means certain safety measures must be taken to ensure worker safety. One hazard data centers must protect against is arc-flash. Data center workers are in a conundrum of sorts: to work on, or perform maintenance on, certain electrical components without risk of arc-flash, electrical power to the components must be turned off. But maintaining uptime often means that various electrical components cannot be shut off. DataInformed explains what arc-flash is, and why it is such a significant concern in data centers: “An important electrical risk in the data center is arc-flash incidents. Arc-flash incidents, which are caused by arcing from an electrical fault, potentially creating a blast similar to an explosion, happen between five and 10 times a day in U.S. industry and result in one death every single workday. Although data center design, permitting and construction are in adherence to modern electrical safety requirements, data center workers must be trained and competent, and must maintain compliance with all OSHA requirements to keep electrical safety in the data center at its current high standard.”

Not only is maximizing safety to protect against arc-flash important for the peace of mind of both employer and employee, it also helps a data center remain OSHA compliant, which reduces liability and cuts down on costs. The specifics of how a data center implements protection against arc-flash are complex and highly individualized. Data centers that do implement best safety practices, though, will ultimately improve both safety and uptime. When designing infrastructure and preparing a data center, it is critical that an arc-flash analysis be completed before the facility is up and running at full capacity. Data Center Knowledge elaborates on what is involved in an arc-flash analysis or study: “An arc flash study looks at all the electrical components, from the source at the power company, the whole way through to the plugs that you plug into your IT equipment,” Furmanski told us in an interview. “They look at how all the circuit breakers are set up — it’s called a coordination study — and they look at the power going through. They punch in all these formulas to figure out, will these breakers move fast enough if there’s an electrical short, or will they move too slowly and let the capability of an arc flash be created?” If your data center has not recently had an arc-flash analysis, or you are not sure whether it ever has, it is incredibly important to complete one as soon as possible to maximize worker safety and uptime.
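To give a feel for the coordination-study idea Furmanski describes, here is a deliberately simplified, hypothetical sketch that checks published breaker clearing times against a target. The breaker names, fault currents, times, and threshold are invented for illustration; a real arc-flash study applies the IEEE 1584 calculation methods and must be performed by qualified engineers.

# (breaker, available fault current in amps, clearing time in seconds at that current)
breakers = [
    ("Main switchgear CB-1", 30_000, 0.30),
    ("PDU feeder CB-7",      12_000, 0.08),
    ("Rack branch CB-42",     3_000, 0.02),
]

MAX_CLEARING_TIME = 0.10  # illustrative target only, in seconds

for name, fault_current, clearing_time in breakers:
    status = "OK" if clearing_time <= MAX_CLEARING_TIME else "REVIEW: trips too slowly"
    print(f"{name}: {fault_current:,} A fault, clears in {clearing_time:.2f} s -> {status}")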

Posted in Computer Room Design, computer room maintenance, Data Center Construction, Data Center Design, data center equipment, Data Center Infrastructure Management, DCIM, Facility Maintenance

3 Data Center Outages That Are Preventable

Data centers function with the continuous goal of maximizing uptime. It is important to avoid outages at all costs while constantly trying to improve energy efficiency and maximize data storage and speed. A variety of factors influence data center outages, but the bottom line is that, from time to time, they do happen. The problem is that, when outages occur, they are not only frustrating; they can result in data loss and significant financial loss. So, what is a data center to do? Are these outages simply unavoidable, aggravating occurrences? No. In fact, Emerson Network Power notes just how preventable these outages can be: “According to the 2013 Study on Data Center Outages by the Ponemon Institute, sponsored by Emerson Network Power, 71% of survey respondents said some or all of unplanned outages experienced within the last 24 months were preventable.” Below, we discuss two common types of data center outages that are, by and large, preventable.

  1. Human Error
    • Human error is, unfortunately, one of the most frequently cited causes of data center outages. It can be avoided with simple measures such as shielding emergency power off (EPO) buttons, which are often not labeled correctly or protected properly; simply shielding and labeling them can prevent outages. Additionally, well-communicated operating instructions and procedures help reduce errors that stem from a lack of information or training. Finally, what may seem like a no-brainer: strict food and drink policies. Even a small liquid or food spill on critical equipment could lead to an outage, so it is important to have strict regulations in place.
  2. UPS/Battery Failure
    • Power supplies can fail for a number of reasons – age, local power outages, storms, surges, and more. For this reason it is critical that an uninterruptible power supply (UPS) be used and, perhaps even more importantly, that there be redundancy. Have a power supply that is adequately sized for your entire capacity and power load, as well as a backup power supply that is also adequate, and be certain to perform proper UPS and battery maintenance routinely; a minimal sizing check along these lines is sketched after this list. Green House Data describes the importance of a proper DCIM, “As data centers become more and more dense, they are drawing more power at each rack. Don’t allow your UPS design to fall below your average IT load. A Data Center Infrastructure Management (DCIM) platform can help you evaluate power draw throughout a given period. Redundant UPS systems are also a necessity to achieve the goal of 100% uptime.”
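Here is the minimal sizing check referenced in the list above: it compares a DCIM-reported IT load against total and N+1 UPS capacity. The module ratings and load figure are hypothetical placeholders, not recommendations for any particular facility.

ups_units_kw = [500, 500, 500]   # e.g. three 500 kW UPS modules in parallel (hypothetical)
it_load_kw = 820                 # peak draw reported by the DCIM platform (hypothetical)

total_capacity = sum(ups_units_kw)
n_plus_1_capacity = total_capacity - max(ups_units_kw)  # capacity with the largest module offline

print(f"IT load {it_load_kw} kW vs. total UPS capacity {total_capacity} kW")
if it_load_kw > n_plus_1_capacity:
    print("WARNING: load exceeds N+1 capacity; a single UPS failure risks an outage")
else:
    print("N+1 check passed: load is covered even with one UPS module offline")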


Posted in computer room maintenance, Data Center Battery, Data Center Design, data center equipment, Data Center Infrastructure Management, data center maintenance, DCIM, Facility Maintenance, Uninterruptible Power Supply, UPS Maintenance

The Future of the Data Center: Scale

What will the data center look like in 5 or even 10 years? It may sound impossible to predict, but experts are weighing in with their predictions for the future of data centers. The storage systems and servers of today will be a distant memory. Cloud computing will take on a whole new life. While 5 or 10 years may sound far off, it is important for the data centers of today to start anticipating these changes and preparing for the future so that they can stay ahead of the game and not fall by the wayside. Storage needs are changing daily, so it is easy to understand that they will be significant in the future. Many experts see most data centers becoming scale data centers by 2025. Data Center Knowledge elaborates on what “scale data centers” are: “Scale data centers are data centers designed the same way web giants like Google, Microsoft, and Facebook design their facilities and IT systems today. Intel isn’t saying most data centers will be the size of Google or Facebook data centers, but it is saying that most of them will be designed using the same principles, to deliver computing at scale.”

Delivering computing at scale is not a simple concept or an easily achievable task, but it is necessary to meet the expected demands of the technology and users of the future. Data Center Knowledge goes on to explain the future demands that will necessitate scale data centers: “Things like the three major forms of cloud computing (IT infrastructure, platform, or software delivered as subscription services), connected cars, personalized healthcare, and so on, all require large scale. ‘If you’re doing a connected-car type of solution, that’s not a small-scale type of deployment,’ Waxman said. ‘If you’re doing healthcare and you’re trying to do personalized medicine, that’s a large-scale deployment.’” As data volumes increase, data centers must be able to scale non-disruptively, and infrastructure must be carefully managed so that it can scale up on demand. The cost of meeting these demands can be managed more easily by scaling up gradually. Schneider Electric also notes that scale will be the future of data centers: “‘We’ll see a dominance of at-scale wholesale data centers, with a movement towards at-scale cloud providers and the verticalization and specialization of the smaller providers in between,’ he says. ‘There will also be a secondary movement to the edge.’ He defines ‘at scale’ as at least 15MW or more, a size needed to support cost-effective IoT and big data deployments — two of the drivers changing the market, according to Doug. ‘Big data, derived in large part from the IoT, is helping shape the way companies develop, improve and bring products to market and serve consumers and customers,’ said Doug. ‘Ultimately, all that data resides in a data center where there must be enough power to process and analyze it.’”

Posted in Cloud Computing, Computer Room Design, Data Center Build, Data Center Design, data center equipment, Data Center Infrastructure Management, DCIM