eZDatacenter ENTERPRISE

• Applications: Deep Learning / Machine Learning / Neural Network Training, GPU noise and heat control
• Target Users: AI training or inference services, High-heat server environments, ESG-focused organizations
• Features: (★★★★★ World Best Technology)
  1. Base model features cooling units mounted on a 42U Smart Rack
  2. Cooling system consists of one integrated Air Conditioner and one Heat Exchanger
    • Or configured with 2 Air Conditioners
  3. Supports up to 4 integrated Air Conditioners (up to 80,000 BTU cooling capacity)
    • 4 AC units are primarily used for AI workloads
    • Customizable as self-contained or split-type outdoor units as needed
  4. Base model can be expanded left or right continuously (1+1, 1+N...)
  5. Fully Customizable
  6. AI Server (Artificial Intelligence Server) – Ideal for cooling high-temperature GPU heat generated during Deep Learning, Machine Learning, and Neural Network training

The Best Solution for AI Servers

The power supply capacity of a single Nvidia DGX B200 is 14 kW.
Since nearly all of that consumed power is converted directly into heat, cooling becomes absolutely essential, and the high-speed fans this requires are why GPU (AI) servers are so noisy.
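The arithmetic behind that claim is easy to check. A minimal sketch (assuming, as above, that essentially all consumed power is dissipated as heat, and using the standard conversion 1 kW ≈ 3,412 BTU/h):

```python
def kw_to_btu_per_hour(kw: float) -> float:
    """Convert electrical power draw (kW) into the cooling load (BTU/h)
    it imposes, assuming all consumed power is dissipated as heat."""
    BTU_PER_HOUR_PER_KW = 3412.142  # standard conversion factor
    return kw * BTU_PER_HOUR_PER_KW

# A single 14 kW server imposes roughly a 47,770 BTU/h cooling load:
print(round(kw_to_btu_per_hour(14)))  # 47770
```

In other words, one such server alone generates more heat than a 40,000 BTU cooling unit can remove, which is why high-density racks need expanded cooling capacity.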

Adding high-density racks (including AI servers) to existing data centers is practically impossible.
The reason is that while a standard rack handles a 5–10 kW load, a high-density rack exceeds 50–100 kW, more than ten times that capacity.
This means not only the power infrastructure but also the entire cooling architecture must be redesigned.

Furthermore, an IDC (Internet Data Center) is not a viable alternative.
IDC services (colocation, data center, server hosting, GPU hosting) charge not only for network usage but also for power consumption (server heat generation = required cooling capacity), which can vary significantly.
They are also typically designed to handle at most 10 kW per rack.

Therefore, the assumption that IDC leasing, colocation, data center rental, server hosting, or GPU hosting can reduce outsourcing and operational costs may no longer be valid.

As of 2025, new data centers capable of accommodating high-density (AI-dedicated) racks are extremely limited, if not nearly non-existent.
Ultimately, the only options are to colocate in a newly built AI-dedicated data center or to build one from scratch.

When building a server room for AI servers, controlling GPU noise and heat during Deep Learning, Machine Learning, and Neural Network training is critical.
Cooling Rack Enterprise is a data center solution optimized for AI services and AI training,
built on more than 10 years of accumulated technology from DOBE Computing, and the best solution available.

Rack
• International standards: compliant
• Rack unit: 42U (base model)

Cooling Unit
• Cooling capacity: 40,000 BTU (max 80,000 BTU)
• Temperature setting range: 68 ℉ ~ 93.2 ℉
• Operating environment: below 113 ℉
• Heat exchanger: more than 10,000 BTU

Controller
• Rated voltage: 220 V / 110 V, single phase
• Current consumption: min 40 A (max 80 A)
• Safety features: door opens automatically on over-temperature; emergency switch
Option
• Indoor unit / outdoor unit / integrated air conditioner
  - Base model comes with 1 AC and can be equipped with up to 4 ACs
  - Heat exchangers are excluded when 2 or 4 ACs are mounted
• Fire extinguisher / UPS / KVM switch / remote monitoring

Dimension
• 2,000 (H) × 1,100 (W) × 1,300 (D)


When the trajectory of data growth and service scalability is still unclear, AI servers and GPUs such as the Nvidia A100 come with a significant price tag, making early-stage over-investment a risk to avoid.

Start with our base model first, then scale your infrastructure as your service grows. This approach enables smart, strategic investment aligned with your actual needs.