In this discussion on Bloomberg Television, experts delve into the evolving landscape of data centers, emphasizing the significant changes driven by the rise of AI. Historically, data centers were built to support ultra-low-latency communications for services like YouTube, Netflix, and TikTok, primarily serving video applications. The current shift toward AI, however, requires data centers to handle massive compute workloads: solving large mathematical problems and running complex algorithms. This transformation demands a different style of data center with much higher power density. Traditional data centers operated at around seven kilowatts per rack; in AI-focused facilities, a single NVIDIA A100 server can draw ten kilowatts on its own. Modern data centers therefore need to be designed for these high-density power requirements while still maintaining ultra-low latency within the data center itself.

The panelists discuss the necessity of building new (greenfield) data centers rather than retrofitting existing ones, because the cooling and power-density requirements differ so sharply. The primary challenges highlighted include securing power sources, particularly stranded power from renewable energy, and managing stretched supply chains for electrical components and high-voltage equipment.

The competitive landscape of the data center industry is also examined, noting the scramble to build infrastructure capable of supporting AI workloads. Companies are partnering with technology giants like NVIDIA to design data centers that can handle AI's demanding compute tasks. The discussion underscores the importance of future-proofing facilities to accommodate higher power densities and advanced cooling systems, ensuring that data centers can support both current and future applications without becoming obsolete.
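
To make the density gap described above concrete, here is a minimal back-of-the-envelope sketch in Python. The seven-kilowatt-per-rack and ten-kilowatt-per-server figures come from the discussion; the 10 MW facility budget and the four-servers-per-rack packing are illustrative assumptions, not figures from the panel.

```python
# Back-of-the-envelope comparison of traditional vs. AI rack power density.
# 7 kW/rack and 10 kW/server are the figures cited in the discussion;
# the facility budget and servers-per-rack packing are assumed for illustration.

TRADITIONAL_RACK_KW = 7        # legacy rack density cited in the discussion
AI_SERVER_KW = 10              # a single NVIDIA A100-class server, per the discussion
SERVERS_PER_AI_RACK = 4        # assumed packing; real deployments vary widely
FACILITY_BUDGET_KW = 10_000    # hypothetical 10 MW power envelope

ai_rack_kw = AI_SERVER_KW * SERVERS_PER_AI_RACK

traditional_racks = FACILITY_BUDGET_KW // TRADITIONAL_RACK_KW
ai_racks = FACILITY_BUDGET_KW // ai_rack_kw

print(f"AI rack density: {ai_rack_kw} kW vs. {TRADITIONAL_RACK_KW} kW traditional "
      f"(~{ai_rack_kw / TRADITIONAL_RACK_KW:.1f}x)")
print(f"A {FACILITY_BUDGET_KW / 1000:.0f} MW facility supports ~{traditional_racks} "
      f"traditional racks but only ~{ai_racks} AI racks at the same power envelope")
```

Under these assumed numbers, the same power envelope supports several times fewer AI racks than traditional ones, even before the heavier cooling load is considered, which is the density gap the panelists cite when arguing for greenfield builds over retrofits.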